The Game-Changing Technologies Powering the Data-Driven Enterprise in 2021 and Beyond


We’re still at the start of the 2020s, and already, things look very different from the preceding decade. For data executives and professionals, the years ahead may mean change on a scale never seen before in the IT industry. Promising new technologies—as well as redesigned and repurposed older ones—are reshaping the data center and analytics shops in new and exciting ways. We asked industry leaders for their views on what is enhancing the ability of enterprises to compete on data.


Digital Threads

A digital thread construct is a live, integrated repository and audit trail of real-time data generated across the end-to-end lifecycle of data entities such as products, employees, and customers. “While the concept of the digital thread is not new, we are just now starting to see widespread adoption and implementation take off,” said Andy Kopp, director of transformation products at Lexmark.

For products, for example, this would include everything, from design through manufacturing and service to recycling, Kopp explained. “This holistic approach to data management often complements IoT- and cloud-enabled as-a-service strategies, providing the opportunity to expand closed-loop analytics across an entire value chain.” The benefits of digital threads include empowering supply chain teams to “evaluate what, where, and when to produce products based on real-world, real-time customer use trends” or helping product development teams “see downstream service implications from their design decisions.”
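To make the idea concrete, a digital thread can be modeled as an append-only event log keyed by entity, with every lifecycle stage writing into the same audit trail. The sketch below is illustrative only; all class, field, and stage names are invented for the example, not drawn from any Lexmark product:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ThreadEvent:
    """One immutable entry in an entity's audit trail."""
    entity_id: str    # e.g., a product serial number
    stage: str        # lifecycle stage: design, manufacturing, service, recycling
    payload: dict
    recorded_at: datetime

class DigitalThread:
    """Append-only, per-entity event log spanning the full lifecycle."""

    def __init__(self):
        self._events = {}  # entity_id -> list of ThreadEvent

    def record(self, entity_id, stage, payload):
        event = ThreadEvent(entity_id, stage, payload, datetime.now(timezone.utc))
        self._events.setdefault(entity_id, []).append(event)

    def history(self, entity_id, stage=None):
        """Full trail for one entity, optionally filtered to one stage."""
        events = self._events.get(entity_id, [])
        return [e for e in events if stage is None or e.stage == stage]

# A single product accumulates events from design through service.
thread = DigitalThread()
thread.record("PRN-001", "design", {"revision": "B"})
thread.record("PRN-001", "manufacturing", {"plant": "Plant-7"})
thread.record("PRN-001", "service", {"ticket": 4821})
```

Because every stage writes to one trail, a service event can always be read against the design revision that produced it—the closed-loop analytics Kopp describes.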


Data Catalogs

Data catalogs, which provide centralized views, via metadata, of data assets across enterprises, are emerging as competitive tools used by data managers and business users alike. “By eliminating the time searching for and finding the right data across complex data ecosystems, data catalogs allow users to quickly find the right data to answer business questions correctly,” said Kim Kaluba, senior manager of data management solutions at SAS.

In the years to come, “data catalogs will continue to mature and morph into the use of information catalogs,” Kaluba predicted. “Organizations will seek to move beyond just cataloging data to being able to identify and catalog all important digital assets in a single location. The rise of information catalogs will improve data understanding and allow adjustment, maintenance, creation, and governance of the most important digital assets impacting enterprises, as well as maximum visibility into the success or failure of these assets.”
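At its core, a catalog indexes metadata—names, descriptions, owners, tags—never the underlying data itself, so search is a scan over those descriptors. A minimal sketch, with hypothetical asset names and fields:

```python
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    """Metadata describing one data asset, not the data itself."""
    name: str
    owner: str
    description: str
    tags: set

class DataCatalog:
    """Centralized, searchable metadata index over enterprise data assets."""

    def __init__(self):
        self._entries = []

    def register(self, entry):
        self._entries.append(entry)

    def search(self, keyword):
        """Return assets whose name, description, or tags mention the keyword."""
        kw = keyword.lower()
        return [e for e in self._entries
                if kw in e.name.lower()
                or kw in e.description.lower()
                or kw in {t.lower() for t in e.tags}]

catalog = DataCatalog()
catalog.register(CatalogEntry("sales_orders", "finance",
                              "Daily order transactions", {"sales", "revenue"}))
catalog.register(CatalogEntry("web_clicks", "marketing",
                              "Raw clickstream events", {"behavior"}))
```

Kaluba’s “information catalog” is the same pattern extended beyond datasets to any digital asset—reports, models, pipelines—registered in the one index.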


Data Intelligence

Taking data catalogs even a step further, data intelligence software is a category that encompasses a range of technologies and is having a profound impact on the capabilities data managers will deliver to their enterprises, especially in the areas of strategic planning, operational excellence, and innovation.

“Think ‘Amazon’ for enterprise data,” said Danny Sandwell, director of product marketing for Erwin by Quest. “[It’s] a true consumer’s approach to intelligently showing what data you can have, understanding what other like-minded data consumers have accessed, a shopping cart to specify your data needs, and built-in preparation capabilities to become a one-stop self-service shop for all your data needs, concerns, and ideation.”

Data intelligence software provides a flexible, automated framework for identifying, understanding, and controlling your data estate and drawing insights from it, facilitating enterprise data coordination, orchestration, and “trust in data” to ensure that enterprise data is discoverable, accessible, understandable, highly available, and protected, said Sandwell.

The technology “delivers a single, consolidated view into an organization’s end-to-end data capability. It combines data cataloging, a deep and curated technical view into data assets, processes, and technologies, with data literacy, a framework of business-data vocabulary, policy, rules, and classification, as well as automation that integrates, activates, and socializes these artifacts and their inter-relationships,” Sandwell noted, adding that this supports a more holistic and effective approach. “The net-net is an enterprise data capability that is well-aligned with business priorities, cost-effective, agile, and able to deliver better time-to-value on data-driven initiatives and use cases that will have the desired transformative impact the business is demanding.”
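The “shopping cart” pattern Sandwell mentions can be sketched in a few lines: a data consumer collects assets, states a purpose for each, and checks out a single access request that governance can approve or deny. All names here are illustrative, not from the Erwin product:

```python
class DataCart:
    """Shopping-cart sketch of self-service data access: select assets,
    then submit one bundled request for governance review."""

    def __init__(self, requester):
        self.requester = requester
        self.items = []

    def add(self, asset_name, purpose):
        # Recording the purpose up front gives reviewers the context
        # needed to grant or refuse access.
        self.items.append({"asset": asset_name, "purpose": purpose})

    def checkout(self):
        """Bundle the selections into a single pending access request."""
        return {"requester": self.requester,
                "assets": [item["asset"] for item in self.items],
                "status": "pending approval"}

cart = DataCart("analyst@example.com")
cart.add("sales_orders", "quarterly revenue forecast")
cart.add("web_clicks", "funnel analysis")
request = cart.checkout()
```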


Scale-Out Databases

Increasingly, databases are being deployed in rapidly changing environments to meet the growing volumes and types of data flowing in from all sources, internal and external. Scale-out databases can be rapidly added or swapped out to meet performance and capacity requirements, and, said Monte Zweben, CEO of Splice Machine, they are being widely adopted. These databases can be custom-fit to specific requirements and use cases. “There is now incredible specialization of databases, so knowing specifically what data will be used for is the best way to choose the right database,” Zweben noted.

These requirements may range from analytical queries with no real-time necessities to analytical queries with machine learning, or real-time transactional queries that demand high availability and minimal downtime. “Scale, access patterns, latency requirements, throughput, availability, and consistency are all important criteria to consider when determining data fit,” said Zweben.
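Zweben’s criteria can be turned into a rough first-pass screen. The mapping below is a deliberately simplified illustration—categories and rules are my own, and real selection still requires benchmarking against actual workloads:

```python
def suggest_database_class(workload):
    """Map workload traits (access pattern, latency, consistency) to a
    coarse database category. Illustrative rules only, not exhaustive."""
    if workload.get("transactional") and workload.get("strong_consistency"):
        return "distributed SQL (scale-out transactional)"
    if workload.get("analytical") and workload.get("low_latency"):
        return "real-time OLAP store"
    if workload.get("analytical"):
        return "columnar data warehouse"
    if workload.get("access_pattern") == "key-value":
        return "scale-out key-value store"
    return "general-purpose relational database"

# Real-time feature serving and overnight reporting land in different
# categories precisely because their latency requirements differ.
serving = suggest_database_class({"analytical": True, "low_latency": True})
reporting = suggest_database_class({"analytical": True})
```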


AI and Machine Learning

AI and machine learning are key to many information technology initiatives. Their impact on data functions will be profound, especially in enterprises pursuing data-driven decision making. AI and machine learning enable “mundane processes to be intelligently automated and can monetize business data,” said Margaret Lee, senior vice president and general manager of digital service and operations management for BMC Software.

“AI and ML are used to extract and utilize valuable data from traditional sources like records and new sources like Internet of Things devices, social media, and customer engagement systems. This technology integrates with automation tools, converts raw data into insights and actions, and trains models from the data pipeline,” Lee said. AI and machine learning also “help to ensure data compliance with data quality best practices, protect privacy with governance tools, and can automate workflows for improved visibility. On top of this, predictive analytics to ingest, store, process, collect, and analyze data can be leveraged.”
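The raw-data-to-insights-to-actions flow Lee describes can be shown with a deliberately tiny pipeline. The z-score check below stands in for a trained model, and the threshold, readings, and action strings are all invented for the example:

```python
from statistics import mean, stdev

def detect_outliers(readings, threshold=2.0):
    """Flag readings more than `threshold` standard deviations from the
    mean -- a minimal stand-in for a trained anomaly-detection model."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [x for x in readings if abs(x - mu) / sigma > threshold]

def pipeline(raw):
    """Raw data -> cleaned data -> insight -> recommended action."""
    cleaned = [x for x in raw if x is not None]   # drop missing readings
    outliers = detect_outliers(cleaned)
    action = "open maintenance ticket" if outliers else "no action"
    return {"outliers": outliers, "action": action}

# A sensor feed with one anomalous reading and one missing value.
report = pipeline([10, 11, 10, 12, 11, 10, 98, None])
```

The point is the shape, not the statistics: ingestion, cleaning, model scoring, and a triggered action sit in one automated flow.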


Automated Machine Learning

Automated machine learning (AutoML) has the potential to create a positive impact on data assets. “Just as DevOps and DevSecOps enabled a higher level of fluidity and business orientation in the application world, AutoML has the capability to bring that to the data world,” said Shriram Natarajan, director of digital strategy and solutions with ISG.
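In miniature, an AutoML loop just automates the fit, score, and compare cycle a data scientist would otherwise run by hand. The two candidate “models” below are trivial stand-ins (all names invented for the sketch), but the selection logic is the essence of the technique:

```python
def fit_mean(xs, ys):
    """Candidate 1: always predict the training mean, ignoring x."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_line(xs, ys):
    """Candidate 2: ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return lambda x: a * x + b

def auto_select(train, valid):
    """Fit every candidate on the training split; keep whichever has
    the lowest mean squared error on the held-out validation split."""
    xs = [x for x, _ in train]
    ys = [y for _, y in train]
    fitted = {name: fit(xs, ys)
              for name, fit in {"mean": fit_mean, "line": fit_line}.items()}
    def mse(model):
        return sum((model(x) - y) ** 2 for x, y in valid) / len(valid)
    best = min(fitted, key=lambda name: mse(fitted[name]))
    return best, fitted[best]

# Clearly linear data: the loop should prefer the line model.
name, model = auto_select(
    train=[(0, 0), (1, 2), (2, 4), (3, 6)],
    valid=[(4, 8), (5, 10)],
)
```

Production AutoML systems search far larger spaces of model families and hyperparameters, but the fluidity Natarajan points to comes from exactly this automation of the compare-and-choose step.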
