The Real-Time Analytics Race: Essential Platforms, Tools, and Strategies

Real-time analytics is a highly sought-after component of a truly digital enterprise, bringing information to the hands of the people who need it, when they need it. However, delivering real-time analytics while overcoming its current challenges—such as data volume, short shelf life, and incompatibility with legacy data infrastructures—is easier said than done.

Experts in real-time analytics joined DBTA’s webinar, Enabling Real-Time Data and Analytics: Key Considerations and Best Practices, to examine the platforms, tools, and strategies essential to enabling real-time analytics and its variety of enterprise-wide benefits.

Andy Connelly, strategy principal, strategic services at Precisely, divided the journey toward real-time analytics into four key areas to consider:

  • Recessionary pressures—doing more with less
  • Business-first approach—connect data to value and outcomes
  • Data quality—both a key enabler and the biggest challenge
  • Operational efficiency—through automation and self-service

Recessionary pressures shape the holistic movement toward real-time analytics, where, across the board, enterprises will have to do more (or the same) with fewer resources. The latter three approaches—business-first, data quality, and operational efficiency—should accommodate those pressures.

A business-first approach, according to Connelly, requires a focus on outcomes and business drivers to effectively bridge the gap between data and business value. Important data and analytics should also be prioritized, along with building engagement and trust with business teams and incorporating data capabilities into business processes.

Traditional methods of data quality need an update, as they often fail to accommodate the speed and scale that real-time analytics necessitates. Instead, enterprises should adopt data quality initiatives that are visual and intuitive, easy to use, and highly reusable. This not only improves confidence in the data but also allows for deeper conversations with business stakeholders through understandable data.

Operational efficiency, Connelly explained, is driven through automation and self-service. Automation reduces the number of manual touches on the data while improving time to value. Self-service, in turn, feeds analytics with trustworthy data across an organization, further optimizing operational efficiency.

Robin Peel, data management maestro, client success at Semarchy, argued that master data management (MDM) creates the foundation for real-time data and analytics. MDM ensures that the “best” version of any data originating from multiple sources is used, enabling everyone to trust the quality and reliability of the data being leveraged to create business value and make decisions.

Peel acknowledged that every company is unique, and so are its needs, which ultimately determine the sort of strategy that will succeed at enabling real-time analytics. Semarchy—which provides an easy-to-use MDM solution—enables all four MDM styles for any domain:

  • Registry, for real-time central reference
  • Consolidation, for reporting, analytics, and central reference
  • Co-Existence, for harmonization across systems and for central reference
  • Centralized, a system of record for operational and analytical systems
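The consolidation style described above can be illustrated with a small sketch. This is not Semarchy's actual data model or API—the records, field names, and survivorship rule below are invented for illustration—but it shows the core idea: merging the same entity from multiple source systems into one "best" golden record.

```python
# Hypothetical customer records for the same entity from two source
# systems; field names are illustrative, not Semarchy's data model.
source_crm = {"id": "C1", "name": "Ada Lovelace", "email": None,
              "updated": "2024-03-01"}
source_erp = {"id": "C1", "name": "A. Lovelace", "email": "ada@example.com",
              "updated": "2024-05-10"}

def consolidate(records):
    """Build a golden record: for each field, take the value from the most
    recently updated record that actually has one. (This "most recent
    non-null wins" rule is one simple survivorship policy; real MDM tools
    support many others, such as trusted-source rankings.)"""
    ordered = sorted(records, key=lambda r: r["updated"], reverse=True)
    golden = {}
    for field in ordered[0]:
        golden[field] = next(
            (r[field] for r in ordered if r[field] is not None), None)
    return golden

golden = consolidate([source_crm, source_erp])
print(golden["name"])   # newer ERP spelling wins
print(golden["email"])  # CRM has no email, so the ERP value survives
```

Downstream analytics then query the golden record instead of reconciling conflicting source copies at report time.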

Sida Shen, product marketing manager at CelerData, focused viewers’ attention on data modeling best practices, emphasizing the difference between normalization and denormalization.

There are three common table schema designs: the snowflake schema, the star schema, and the flat-table schema. Normalization transforms a flat-table schema into a snowflake schema, eliminating redundancy in the data and simplifying maintenance, at the cost of requiring JOINs at query time. Denormalization does the opposite: it eliminates JOINs, but locks data into a single-view format with no flexibility to adapt to business changes.
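The trade-off can be made concrete with a minimal sketch, using SQLite and an invented orders table (the schema and data are illustrative only). Splitting a flat table into a fact table plus a dimension removes the repeated product attributes, but reading the data back now requires a JOIN:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized flat table: product attributes repeated on every order row.
cur.execute("CREATE TABLE orders_flat "
            "(order_id INT, product_name TEXT, product_price REAL)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?)",
                [(1, "widget", 9.99), (2, "widget", 9.99),
                 (3, "gadget", 24.50)])

# Normalized form: a fact table plus a product dimension, so each
# product's attributes are stored exactly once.
cur.execute("CREATE TABLE dim_product "
            "(product_id INTEGER PRIMARY KEY, name TEXT, price REAL)")
cur.execute("INSERT INTO dim_product (name, price) "
            "SELECT DISTINCT product_name, product_price FROM orders_flat")
cur.execute("""CREATE TABLE fact_orders AS
               SELECT o.order_id, p.product_id
               FROM orders_flat o
               JOIN dim_product p ON p.name = o.product_name""")

# The redundancy is gone, but reassembling the original view costs a JOIN.
rows = cur.execute("""SELECT f.order_id, p.name, p.price
                      FROM fact_orders f
                      JOIN dim_product p USING (product_id)
                      ORDER BY f.order_id""").fetchall()
print(rows)  # [(1, 'widget', 9.99), (2, 'widget', 9.99), (3, 'gadget', 24.5)]
```

If that JOIN is too slow at analytical scale, teams are pushed back toward maintaining the flat copy—which is exactly the preprocessing burden described next.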

This forces data practitioners to create complex preprocessing pipelines, since existing solutions are not optimized for multi-table JOINs.

Shen pointed to this simple solution: just make JOINs run faster!

With StarRocks, CelerData’s real-time OLAP database, enterprises can run on-the-fly JOIN queries at speed, eradicating the need for denormalization pipelines. StarRocks further allows enterprises to benefit from storage-compute separation by design, real-time mutable data, no external dependencies, and standard SQL compliance.
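One payoff of querying normalized tables on the fly is freshness. The sketch below uses SQLite purely as a stand-in (StarRocks' actual SQL and behavior may differ, and the tables are invented): a one-row update to a dimension is visible to the very next JOIN query, whereas a denormalized copy would need every affected row backfilled.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INT PRIMARY KEY,
                              name TEXT, price REAL);
    CREATE TABLE fact_orders (order_id INT, product_id INT);
    INSERT INTO dim_product VALUES (1, 'widget', 9.99);
    INSERT INTO fact_orders VALUES (101, 1), (102, 1);
""")

query = """SELECT f.order_id, p.price
           FROM fact_orders f
           JOIN dim_product p USING (product_id)
           ORDER BY f.order_id"""

print(con.execute(query).fetchall())  # both orders see the old price

# A single-row change to the dimension; with a denormalized table this
# would instead mean rewriting every affected order row.
con.execute("UPDATE dim_product SET price = 12.49 WHERE product_id = 1")
print(con.execute(query).fetchall())  # the next JOIN reflects it immediately
```

The design choice is to spend engine effort making the JOIN fast rather than spending pipeline effort keeping a flat copy consistent.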

Amy Dickson, senior director, product management at Actian, boiled down data goals into four main categories:

  • Driving data reliability to foster a data-driven culture
  • Providing data quality and real-time insights for confident decision making
  • Empowering all skill levels to transform data and drive data quality assurance
  • Reducing wasted time and manual effort with fast and easy data prep

However, data silos, data quality concerns, skills gaps, data latency, and cloud mandates prevent these goals from becoming a reality, especially regarding driving real-time analytics.

Dickson then presented the Actian Data Platform, a trusted, flexible, and easy-to-use data platform that helps transform business by simplifying how people connect, manage, and analyze data. By merging connection, management, and analysis within a single platform, Actian solves today’s greatest analytics challenges—including going real time.

The Actian Data Platform offers great flexibility and reliability—meeting enterprises where they are now, and in the future—while driving high performance at a low price point.

For an in-depth review of the latest platforms, solutions, and strategies for adopting real-time analytics, you can view an archived version of the webinar here.