How to Successfully Manage Fast Data with Real-Time Capabilities


Today, the average enterprise has data streaming into business-critical applications and systems from a plethora of endpoints, ranging from smart devices and sensor networks to web logs and financial transactions.

This onslaught of fast data is growing in size, complexity and speed, fueled by increasing business demands and the growth of the Internet of Things.

The ability to act quickly on information to solve problems or create business value has long been the goal of many businesses. However, it was not until recently, with the emergence of new technologies, that the speed and scalability requirements of real-time data analysis could be addressed both technically and cost-effectively at scale.

DBTA recently held a webinar with Richard Lewis, solution engineer, DataStax, and Steve Wilkes, founder and chief technology officer, Striim, who discussed key fast data solutions.

As enterprises move their business-critical data to the cloud, hybrid cloud architecture, which supports data in both public and private cloud environments, is being embraced as a strategic middle ground, Lewis explained.

In this world, companies need an operational data layer that is never down, keeps data distributed close to the endpoints, delivers response times in milliseconds, and enables meaningful transactions that have context.
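In practical terms, that operational layer is typically a distributed database replicated across the same on-premises and public cloud environments the hybrid model spans. The sketch below, using the open source DataStax Python driver (cassandra-driver), shows how such a keyspace might be defined; the contact point address, keyspace name, and datacenter names (onprem_dc, cloud_dc) are illustrative assumptions rather than details from the webinar.

```python
from cassandra.cluster import Cluster

# Connect to any reachable node; a masterless cluster has no primary
# whose failure can take the whole layer down.
cluster = Cluster(["10.0.0.1"])  # illustrative contact point
session = cluster.connect()

# Hypothetical keyspace replicated into both an on-premises and a
# public-cloud datacenter, so data lives close to the endpoints that
# read and write it.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS operational_layer
    WITH replication = {
        'class': 'NetworkTopologyStrategy',
        'onprem_dc': 3,
        'cloud_dc': 3
    }
""")
```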

Key requirements for real-time success consist of the following principles:

Realize Value From Data

  • Large Amounts of Data
  • Multiple Sources
  • High Rate of Ingest
  • Ability to take action as data is ingested

Digital First

  • Younger Generations
  • Different Expectations

Personalized Experience

  • Prefer Applications
  • Intelligent Digital Personalization
  • Automated Support

Trust and loyalty are built on reduced failure, and brick-and-mortar locations still play a role in delivering instant gratification, Lewis said.

DataStax Enterprise, the Active Everywhere database built on Apache Cassandra’s masterless architecture, is helping companies modernize their operational database capabilities to optimize the value of their data and power mission-critical applications.
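To give a sense of what reading from such a masterless cluster looks like in application code, the sketch below uses the DataStax Python driver to serve a profile lookup at LOCAL_QUORUM consistency, which keeps the read inside the nearest datacenter; the keyspace, table, and customer ID are hypothetical and carried over from the earlier sketch.

```python
from cassandra import ConsistencyLevel
from cassandra.cluster import Cluster
from cassandra.query import SimpleStatement

# Connect to the hypothetical keyspace from the earlier sketch.
cluster = Cluster(["10.0.0.1"])
session = cluster.connect("operational_layer")

# LOCAL_QUORUM confines the read to replicas in the local datacenter,
# which is how millisecond response times are preserved no matter where
# the other datacenter lives.
stmt = SimpleStatement(
    "SELECT * FROM customer_profile WHERE customer_id = %s",
    consistency_level=ConsistencyLevel.LOCAL_QUORUM,
)
row = session.execute(stmt, ("c-1001",)).one()
```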

With a variety of deployment options and cloud capabilities, DataStax can put an organization on the path to better business outcomes, according to Lewis.

Streaming integration has emerged as a major infrastructure requirement and is the foundation of fast data, Wilkes said.

Streaming integration can handle extreme volumes of data at scale with high throughput; process, analyze, and correlate data in flight; and make data valuable, verifiable, and visible in real time, Wilkes explained.

According to Wilkes, Striim can provide continuous data collection and data delivery with its platform.
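Striim's own interfaces are not shown in the webinar recap, but the general shape of a streaming integration pipeline, continuous collection, in-flight processing, and continuous delivery, can be sketched generically in Python as below; the event fields and the source and sink objects are illustrative assumptions, not Striim APIs.

```python
import json
import time
from typing import Dict, IO, Iterator


def collect(source: IO[str]) -> Iterator[Dict]:
    """Continuously pull raw events from a source such as a log file or CDC feed."""
    for line in source:
        yield json.loads(line)


def process(events: Iterator[Dict]) -> Iterator[Dict]:
    """Filter, enrich, and timestamp events while they are still in flight."""
    for event in events:
        if event.get("amount", 0) <= 0:      # drop records with no business value
            continue
        event["processed_at"] = time.time()  # add context before delivery
        yield event


def deliver(events: Iterator[Dict], sink: IO[str]) -> None:
    """Continuously hand processed events to a target system (topic, warehouse, cache)."""
    for event in events:
        sink.write(json.dumps(event) + "\n")


# Example wiring: stream events from stdin to stdout without ever landing them on disk.
if __name__ == "__main__":
    import sys
    deliver(process(collect(sys.stdin)), sys.stdout)
```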

An archived on-demand replay of this webinar is available here.

