The State and Current Viability of Real-Time Analytics


The real-time revolution for enterprises has been underway for more than a decade, characterized by capabilities for gathering, processing, and delivering data that range from near real time (within hours) all the way to more instantaneous movement within seconds or subseconds.

While the technology has been around for some time, real-time analytics has lately come to the forefront as a critical component of burgeoning AI, edge, and digital commerce systems.

And there’s a twist as well. Data managers now prefer real-time analytical capabilities built within their applications and systems, rather than a separate, standalone, or bolted-on project. Interest in real-time analytics as a standalone effort has dropped from 50% to 32% during the past 2 years, a recent survey of 259 data managers conducted by Unisphere Research finds (2025 Market Study: “Modern Data Architecture in the AI Era,” Unisphere Research, July 2025).

The drop in real-time analytics projects “suggests that this capability is increasingly assessed as part of broader data architectural strategies rather than as a standalone research priority,” according to the study’s author, John O’Brien, principal advisor with Radiant Advisors. This “reveals how profoundly the emergence of AI has altered long-term value calculations. In 2023, real-time analytics represented the dominant future value proposition, suggesting organizations believed competitive advantage would come from faster operational decision-making. The 2025 results indicate a fundamental strategic pivot, where organizations now believe that competitive advantage will arise from intelligent systems capable of providing predictive insights and automated actions, rather than from simply faster human decision-making processes.”

While introducing real-time analytics “used to require heavy custom engineering, it is now embedded into mainstream platforms, which makes real-time insights available to more organizations,” Michael Pytel, lead technologist at VASS North America, confirmed. “Then, the increase of cloud-native architectures lowered the barrier to scale to receive more data more quickly and to have it analyzed for trends than ever before. Look at how quickly AI has been adopted into the day-to-day of employees over the past few years.”

ARE WE THERE YET?

So, the question becomes: Are real-time analytics ubiquitous to the point where they are automatically integrated into any and all applications? By now, the use of real-time analytics should be a “standard operating requirement” for customer experience, said Srini Srinivasan, founder and CTO at Aerospike. This is where the rubber meets the road—where “the majority of the advances in real-time applications have been made in consumer-oriented enterprises,” he added.

Along these lines, the most prominent use cases for real-time analytics include “risk analysis, fraud detection, recommendation engines, user-based dynamic pricing, dynamic billing and charging, and customer 360,” Srinivasan continued. “For over a decade, these systems have been using AI and machine learning [ML], inferencing for improving the quality of real-time decisions to improve customer experience at scale. The goal is to ensure that the first customer and the hundred-millionth customer have the same vitality of customer experience.”

There is a dimension beyond customer experience as well: Real-time analytics are part of operational technology systems that run industrial-level applications and systems. “Within industries such as energy, life sciences, and chemicals, the next decade of real-time analytics will be driven by more autonomous operations,” said David Streit, VP, enterprise operations platform, at Emerson’s Aspen Technology business unit. “In the near future, real-time operational data will make AI assistants and agents even more accurate. They’ll be able to answer operator questions like, ‘How can I improve throughput?’ not just based on what happened in the past, but what’s happening in the moment.”

The good news is that real-time analytics is becoming highly democratized. Such capabilities “used to be limited to companies with sophisticated tooling and talent,” said Sandeep Prakash, VP of product management at Pentaho. “Foundational infrastructure and skill challenges that were once extremely complex and expensive have largely been overcome.” Popular approaches to real-time analytics range from “running your own open-source options such as Apache Flink or Apache Kafka to cloud-native offerings.”

Technologies such as Apache Kafka “enable low-latency pipelines that operate within milliseconds, facilitating rapid data processing and transfer,” said Carlos Rolo, open-source contributions team lead at NetApp Instaclustr. “Additionally, high-volume databases like Apache Cassandra and ClickHouse cater to different workload needs. Cassandra excels in write-heavy scenarios, while ClickHouse is optimized for read-heavy tasks. These open-source technologies have made real-time data processing a viable and efficient option.”
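To make the kind of low-latency pipeline Rolo describes concrete, the sketch below shows a minimal Kafka producer and consumer using the open-source kafka-python client. It is an illustrative example only: the broker address (localhost:9092), the "clickstream" topic, and the event fields are assumptions, not details from any of the vendors quoted here.

    # Minimal sketch of a low-latency streaming pipeline on Apache Kafka,
    # using the kafka-python client. Assumes a local broker at localhost:9092
    # and a hypothetical "clickstream" topic; both are illustrative.
    import json
    import time

    from kafka import KafkaProducer, KafkaConsumer

    # Producer: publish an event as soon as it occurs.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("clickstream", {"user_id": 42, "action": "add_to_cart", "ts": time.time()})
    producer.flush()  # block until the event reaches the broker

    # Consumer: read events continuously and report a simple per-event latency.
    consumer = KafkaConsumer(
        "clickstream",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for message in consumer:
        latency_ms = (time.time() - message.value["ts"]) * 1000
        print(f"{message.value['action']} arrived after ~{latency_ms:.1f} ms")

In practice, the same pattern scales out by partitioning the topic and running consumer groups, which is where write-heavy stores such as Cassandra or read-optimized engines such as ClickHouse typically sit downstream of the stream.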

Ultimately, real-time analytics is possible today for organizations “that have conquered how to make data from across their business accessible and actionable,” said Streit.
