Discovering the Keys to Running Analytics in Real-Time

The onslaught of fast data is growing in size, complexity, and speed, spurred by increasing business demands and the rise of the Internet of Things. As a result, operationalizing insights at the point of action has become a top priority. New technologies are coming to the forefront to facilitate real-time analytics, including in-memory platforms, self-service BI tools, and all-flash storage arrays.

DBTA recently held a webinar featuring Terry Walters, senior solution architect at Hazelcast; Brian Bulkowski, co-founder and CTO at Aerospike; and Kevin Petrie, senior director and technology evangelist at Attunity, who discussed enabling technologies, important success factors, and real-world use cases for achieving real-time success.

According to Walters, a successful big and fast data stack combines big data volumes with low latency, real-time responsiveness, situational awareness, architectural simplicity, change data capture, and a path from batch to real-time processing.
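Change data capture, one of the elements Walters cites, means propagating only the rows that changed since the last sync rather than re-copying whole tables on a batch schedule. A minimal sketch of the idea, in illustrative Python with invented names (real CDC tools typically read database transaction logs rather than diffing snapshots):

```python
# Illustrative change data capture: diff two table snapshots (dicts keyed by
# primary key) into a list of change events for a downstream target.
# All names here are hypothetical, not any vendor's API.

def capture_changes(previous, current):
    """Return (operation, key, row) events describing how `previous`
    became `current`."""
    changes = []
    for key, row in current.items():
        if key not in previous:
            changes.append(("insert", key, row))
        elif previous[key] != row:
            changes.append(("update", key, row))
    for key in previous:
        if key not in current:
            changes.append(("delete", key, None))
    return changes

# Example: between two snapshots, row 2 changed, row 3 appeared, row 4 vanished.
before = {1: {"name": "alice"}, 2: {"name": "bob"}, 4: {"name": "dan"}}
after = {1: {"name": "alice"}, 2: {"name": "bobby"}, 3: {"name": "carol"}}
events = capture_changes(before, after)
```

Emitting only these events, instead of the full table, is what lets a pipeline move from periodic batch loads toward continuous, near-real-time replication.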

Hazelcast Jet is one of the many platforms that can help users achieve this goal, Walters explained. The platform provides real-time stream processing, distributed computing, and data-processing microservices.
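The stream-processing model behind platforms like Jet is a pipeline: a source feeds events through transform stages into a sink. Hazelcast Jet's actual API is Java-based and runs these stages distributed across a cluster; the stdlib Python sketch below, with invented names, only illustrates the source-transform-sink shape:

```python
# Conceptual source -> transform -> sink pipeline, the basic shape of a
# stream-processing job. Hypothetical stage names; a real engine would run
# these stages in parallel across a cluster rather than in one generator chain.

def source(events):
    # In a real system this would be a Kafka topic, socket, or journal.
    for event in events:
        yield event

def transform(stream):
    # Keep only readings above a threshold and tag them as alerts.
    for reading in stream:
        if reading["value"] > 100:
            yield {**reading, "alert": True}

def sink(stream):
    # In a real system this would write to a database, map, or dashboard.
    return list(stream)

readings = [{"id": 1, "value": 50}, {"id": 2, "value": 150}]
results = sink(transform(source(readings)))
```

Because each stage consumes events as they arrive rather than waiting for a complete batch, latency is bounded by the pipeline itself, which is the property that makes real-time analytics possible.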

Bulkowski defined the keys to success as re-imagining data to fit your analytics, using modern, programmatic languages and tools, building for greater “front edge” data sizes, and building a hybrid transaction/analytical processing (HTAP) architecture.

Aerospike platforms can assist users in reaching these milestones, Bulkowski explained, as hybrid memory architectures enable digital transformation in fundamentally different ways. They deliver simplicity, faster time to market, business agility, competitive advantage, and the lowest total cost of ownership (TCO).

Modern fast data requires the ability to ingest and replicate data on demand, stream it in real time, and load it to any target, Petrie said.

Attunity Replicate is a solution that requires no manual coding or scripting, is automated end to end, and is optimized and configurable.

The platform can accelerate data delivery and availability, automate data readiness for analytics, and optimize data management with intelligence, Petrie explained.

An archived on-demand replay of this webinar is available here.