Real-time data delivery represents the next frontier of intelligent enterprises, and there is great potential value in the ability to immediately sense and respond to opportunities and threats. At the same time, enterprises are encumbered by existing or legacy technologies and methodologies that may add latency to their data-delivery efforts.
What will a real-time enterprise look like? There will be many variations, but industry observers agree that data is at the core of all efforts, as it shapes the quality of decisions.
Most executives (77%) responding to a recent survey conducted by IDC in partnership with InterSystems agree that lack of timely access to data is inhibiting their businesses. This lag is also slowing the pace of business, the survey finds, with 54% stating it limits operational efficiency and 27% saying it has impacted productivity and agility. The survey also uncovers increasing demand for reliable real-time data analytics—34% intend to speed up innovation through faster data delivery.
Data at the Core
“In an ideal data-driven enterprise, people make the right decisions quickly, based on evidence,” said Aaron Kalb, head of product at Alation. The challenge right now, he added, is that “decision makers wait endlessly for data to trickle in.”
Ultimately, the goal of such capabilities is to drive better customer experiences and more efficient processes. “This means capturing relevant data in near real-time across different sources as it is generated, transforming it on the fly, making automated intelligent decisions based on it, and translating those decisions into actions—like customer interactions or work orders,” said Ben Hopkins, senior product manager at Pentaho.
The result is an insights-driven enterprise, as expressed by Jai Ganesh, vice president and head of Mphasis NEXT Labs. “Data is growing exponentially and this presents challenges for enterprises who want to monitor and mine these data sources,” he explained. “An ideal insight-driven enterprise would be one which leverages technologies and analytics techniques such as natural language processing, machine learning, adaptive and learning algorithms, image analytics, vision-based sensing and image recognition, spatial and contextual awareness, reasoning and decision automation, pattern recognition, neural networks, semantic knowledge, robotic decision making, and emotional intelligence to engage in real time with enterprise data sources. Such an enterprise would be able to help decision makers by providing them with the right enterprise data metrics that will influence their strategy, products, services, and brand.”
Embracing a real-time, data-driven enterprise means taking a new approach to decision making, agreed Rich Fitchen, general manager for Bizagi North America. “Business leaders have to cultivate a mindset throughout the business to continually remove subjective bias from everyday thinking and rely solely on the data to determine what happens next. Successful data-driven enterprises leave competitors in the dust by using operational data alongside machine learning capabilities to intelligently assemble and trigger digital processes that focus on enabling employees to deliver better customer experiences.”
Real-time data delivery has a range of potential applications. For example, real-time capabilities can transform the buyer-seller relationship. “Companies and sales teams that are able to quote, propose, contract, and interact with their prospects in close to real-time with the exact content, pricing, and product requested will be the winners and continue to build momentum and market share over their laggard peers,” said David Kerr, CEO of Octiv. “Quoting, selling, and contracting technologies will reshape how buyers and sellers interact. Integrations, workflows, and collaboration solutions are subsets of those technologies that will redefine the efficiency and effectiveness of the selling and buying processes.”
Operational efficiency is another area that benefits from real-time data delivery. “The ideal data-driven enterprise is one that is set up to proactively and continually monitor data integrity, particularly in industries where the flow of data must be tied to the flow of products,” said Angela Fernandez, vice president and head of the national data quality program at GS1 US. “Industries, including retail, healthcare, and food service, leverage global standards in their supply chains to provide a common foundation for uniquely identifying products, capturing information about them, and sharing it with other companies as well as with consumers. Adoption of these standards and best practices can help eliminate manual processes that are susceptible to error, enable better data interoperability with other organizations, and increase speed-to-market by making data more actionable.”
Rise of New Technologies
Forces such as the Internet of Things, artificial intelligence, and machine learning are making a real-time, data-driven enterprise a reality, industry observers agree. “The explosion of technologies that can handle large volumes of data at low cost has really been a driving force in the big data revolution of the past few years,” said Leena Joshi, vice president of product marketing at Redis Labs. “In this continually evolving market, trends that make this data actionable in real-time will create the difference between winners and losers.” Capturing data in real time, powered by the Internet of Things, can be effective only with systems capable of handling large data volumes with very low latencies, Joshi noted. “Being able to implement adaptive applications powered by machine learning in real time is a critical aspiration for most enterprises, but real-time databases that can power such applications with built-in capabilities are most likely to make these aspirations a reality.”
These capabilities are increasingly supported by cloud computing as well. “The cloud undoubtedly has caused the biggest shift when it comes to data-driven enterprises,” said Bizagi’s Fitchen. “Enterprise cloud computing allows for significant storage of data and high compute power at scale, at a cost-effective price point. Using a cloud-first strategy, enterprises can make important steps forward in the digital efficiency and agility of their continually evolving business.”
Such a data-driven, real-time enterprise is tasked with carving clear, actionable insights from the data it collects. “Unfortunately, it’s often true that the sheer volume of data produced and gathered by modern enterprises can be so overwhelming that decision makers cannot make sense of it,” said Paul Hofmann, chief technology officer for SpaceTime Insights. “That is why data-driven enterprises employ emerging technologies such as machine learning algorithms and associated visualization tools, which can cull through the data to help organizations create actionable insights for important decision making. With these tools, companies can enable smarter, faster, and more informed decisions in real time.”
Every facet of the real-time enterprise may require different parts of the infrastructure. “Big data as a trend is old news—nearly everyone is dealing with the volume, variety, and velocity of data from a multiplicity of sources,” said Emma McGrattan, senior vice president of engineering at Actian. “It’s important to have the right tool for the right job, the best-fit database for different data types optimized for different access methods and service levels. It may take different data management technologies to manage transactional data, analytic queries, real-time streaming, and relationship graphs, but all those sources can enrich business decision making.”
The Internet of Things is also accelerating the move to real time. “There is a need for real-time insights to take quick action, operational insights to make short-term decisions, and strategic insights to enable business decisions, based on IoT data,” said Suraj Kumar, vice president and general manager of platform as a service at Axway. “Predictive analytics and machine learning are being used increasingly in the data-driven enterprise to anticipate events and improve performance. Big data analytics enable modern data-driven enterprises to gain actionable insights from vast data, enabling highly targeted marketing, uncovering new revenue opportunities, and driving operational efficiency.”
Industry observers see a confluence between IoT and machine learning that will make the real-time enterprise a reality. “The IoT and ML are two key factors making the data-driven enterprise a reality,” said SpaceTime Insights’ Hofmann. “IoT exponentially increases the amount of data that a company can collect, especially in asset-rich industries like manufacturing and logistics. And machine learning functionality can be provisioned to sift through data to surface the information truly important to a company. But for the most success, machine learning needs to be performed in real time through technologies like IoT-enabled data sensors and edge computing systems that can analyze data closer to where it is generated.”
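The edge pattern Hofmann describes can be sketched in a few lines: rather than streaming every sensor reading to a central system, an edge node analyzes data where it is generated and forwards only the readings that matter. This is a toy illustration only; the function name and thresholds are hypothetical, not drawn from any vendor's product.

```python
def edge_filter(readings, low=10.0, high=90.0):
    """Edge-node sketch (hypothetical thresholds): inspect sensor
    readings locally and forward upstream only the out-of-range
    values, cutting network traffic and round-trip latency."""
    return [r for r in readings if r < low or r > high]

# Four raw readings arrive; only the two anomalies leave the edge node.
print(edge_filter([55.0, 8.2, 60.1, 97.4]))  # [8.2, 97.4]
```

In a real deployment the in-range readings might still be aggregated and shipped in periodic summaries; the point is that the per-reading decision happens at the edge, not in a central analytics database.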
Other approaches helping to make a difference include the emerging practice of hybrid data management, “capable of handling both operational and analytic workloads, and hybrid integration platforms,” said Actian’s McGrattan. “With hybrid data management, analytics can occur closer to the transactional data that runs the business, while optimizing performance to deliver actionable insights.”
Developments further enabling the cost-effective delivery of real-time capabilities include open source platforms such as Apache Spark and Kafka. “Spark’s flexibility to handle different workloads, including micro-batch processing for close to real-time results on large data volumes as well as machine learning workloads, make it one of the key technologies in this new landscape,” said Pentaho’s Hopkins. “Kafka’s ability to reliably stream data in a distributed fashion at scale is key to fueling the closed-loop environments.”
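The micro-batch model Hopkins refers to can be illustrated without Spark itself. The plain-Python sketch below (function name and batch size are hypothetical) groups an incoming event stream into small fixed-size batches and computes one result per batch, which is the basic tradeoff Spark Structured Streaming makes: a bounded per-batch delay in exchange for efficient bulk processing.

```python
def micro_batch(events, batch_size=3):
    """Group a stream of events into fixed-size micro-batches and
    process each batch as soon as it fills. In Spark, the per-batch
    step would be the per-trigger computation; here we just sum."""
    batch, results = [], []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            results.append(sum(batch))
            batch = []
    if batch:  # flush any final partial batch
        results.append(sum(batch))
    return results

print(micro_batch([1, 2, 3, 4, 5, 6, 7]))  # [6, 15, 7]
```

Production systems typically trigger on time windows as well as batch size, but the latency consequence is the same: results arrive once per batch rather than once per event.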
“In addition, there needs to be a way to effectively connect the devices in order to protect and manage the data flowing between systems, people, and IoT devices,” said Axway’s Kumar, who recommends deployment of application programming interfaces (APIs). “APIs allow organizations to securely expose data in a standard format between applications, humans, and connected devices to customers, go-to-market channels, and other applications in an IT infrastructure. APIs connect important ‘things’ like cars, medical devices, smart grids, and thermostats to your ecosystem, making flexible, scalable, and—above all—secure API management critical to the success of IoT.”
Are existing data technologies enough to achieve a real-time enterprise, or is more solution innovation needed? For many organizations with investments in legacy systems, it’s important to build on top of existing investments. “A complete overhaul will be cost-prohibitive for an enterprise,” said Mphasis NEXT Labs’ Ganesh. “The alternative is to invest in data extraction, analysis, and interfaces which will enable existing systems to leverage and interface with new business requirements.”
At the same time, the InterSystems-IDC survey finds a perception that legacy technologies and methodologies are holding organizations back from real-time data delivery. Organizations using standard data management techniques—particularly extract, transform, and load (ETL) and change-data capture (CDC)—are not keeping up with the demand for real-time data analysis, impacting business opportunities and efficiency. Both ETL and CDC are common across many enterprises today, yet these technologies introduce a great deal of latency: The survey finds that close to two-thirds of data moved via ETL was at least 5 days old by the time it reached an analytics database. In addition, it takes, on average, 10 minutes or more to move most CDC data into an analytics database—which may be problematic for enterprises seeking to operate on split-second insights, such as extending a new offer to a customer while they are online or on the phone.
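The difference between the two techniques is easy to see in miniature. The sketch below (SQLite in memory; the table, column, and function names are hypothetical, purely for illustration) contrasts an ETL-style full extract, which copies every row on every run, with a CDC-style poll that ships only rows changed since a last-seen watermark:

```python
import sqlite3

# Toy source table: three orders with last-modified timestamps.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 10.0, 100), (2, 20.0, 105), (3, 30.0, 110)])

def full_extract(conn):
    """ETL-style: copy every row on each run, changed or not."""
    return conn.execute("SELECT id, amount FROM orders").fetchall()

def cdc_poll(conn, last_seen):
    """CDC-style: ship only rows modified since the last watermark,
    and advance the watermark for the next poll."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (last_seen,)).fetchall()
    watermark = max((r[2] for r in rows), default=last_seen)
    return [(r[0], r[1]) for r in rows], watermark

print(full_extract(conn))            # all 3 rows, on every run
changed, watermark = cdc_poll(conn, last_seen=104)
print(changed)                       # only the 2 rows changed after t=104
```

Even this CDC sketch only moves data when it polls; the survey's 10-minute average reflects exactly that polling and load delay, which is why split-second use cases push toward streaming delivery instead.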
“Existing technologies often prove the primary obstacle to becoming a true data-driven enterprise at the speed required,” said Fitchen. “Legacy solutions are transactional in nature and lack the agility and flexibility required to support digital business operations. In the digital age, processes and the data that supports them no longer reside in isolated, monolithic systems. Instead business processes cut across systems and require abstracted data to connect people, devices, and applications on a global scale.”
The challenge of ETL and CDC-based infrastructures becomes more acute with the continuing rise of unstructured data in enterprises—IoT data, streaming data from external sources, sensor data, graphs, key-value data, video/audio/image, objects, JSON documents, and geospatial data. As Ben Newton, machine data analytics lead at Sumo Logic, put it, “The technology being used for modern applications—microservices, serverless architectures, IoT—is driving up both the volume and variety of data, increasing the complexity of data collection and the cost of analysis for modern applications, and overwhelming traditional data analytics tools.”
Skills for Real-Time Success
As with all groundbreaking technologies, enterprises will require individuals with the skill sets for building and deploying real-time, data-driven systems. Skills with such platforms as Apache Spark and Kafka are needed, as well as abilities to recognize business areas in which real-time will deliver value. “The most important skills will not be technical ones but business ones: the ability to comprehend the landscape of available technologies and data and to see the opportunities to combine those sources in new ways to drive business insights,” said McGrattan. “The most disruptive and successful companies over the past 2 decades have been ones like Google, Amazon, Netflix, and Facebook, who recognized the power of data and its relationships to build new business models to better serve the needs of their customers.”
Another challenge is getting enterprise culture to embrace a real-time, data-led approach. “Most business leaders are focusing on the business issues—the need to remain competitive, customer expectation for digitized services, and modernizing legacy systems,” said Fitchen. “However, it is equally important to prioritize cultural and mentality changes that embrace agility and empowerment,” he noted.
“You shouldn’t have to be a data scientist to work with active data,” agreed Peter Yared, co-founder and CTO of Sapho, adding it should be usable by people who use Excel or simple BI tools. “The point is to figure out what’s important right now from the small set of data that is currently changing. This requires more business insight than data chops.”
“The most important factor to success is an understanding of the use case and how more data and faster analysis can lead to better business results,” said McGrattan. For the most part, the technology exists, she added. However, it takes a clear business return to determine how much to invest in adopting that technology in order to improve current business practices and achieve the intended outcome.