Game-Changing Technologies Fueling the Data-Driven Enterprise in 2023


Data Fabric
Data fabric—an architecture that virtualizes and centralizes data from across the enterprise into a unified data infrastructure—is gaining appeal. “As organizations look to harness the massive amounts of data permeating their businesses, they’re focusing on data strategies that center around connectivity, accessibility, integration, and enablement,” said Adam Glaser, SVP of engineering for Appian. “Unlike other data management innovations—such as data lakes or data warehouses—data fabric doesn’t require sophisticated engineering expertise. Its unique emphasis on the virtualization and centralized management of data eliminates the need for data migrations or APIs, allowing nontechnical personnel to use low-code tools to do the data modeling work themselves.”

Progress: While data fabric offerings are still relatively new, “they are very sophisticated and have delivered provable benefits to the enterprises using them,” said Glaser. “It’s an easy solution to integrate into an enterprise’s existing infrastructure. It’s designed to easily connect to a business’ data sources, wherever they reside. More importantly, it doesn’t require enterprises to change how or where they store their data.” In addition, vendors “have made major investments in point-and-click integrations of new data sources into the data fabric and low-code, visual tools for configuring capabilities like security and auditing.”
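The virtualization idea Glaser describes can be sketched in a few lines of Python: a thin layer exposes one unified view over sources that each keep their own data, with nothing migrated. The `DataFabric` class and the stand-in sources below are purely illustrative, not any vendor's API.

```python
# Minimal sketch of the data-fabric idea: a virtualization layer that
# routes queries to registered sources, so data stays where it lives.
# All names here (DataFabric, register_source) are hypothetical.

class DataFabric:
    """Presents heterogeneous sources as one queryable dataset."""

    def __init__(self):
        self._sources = {}  # source name -> callable returning rows (dicts)

    def register_source(self, name, fetch):
        self._sources[name] = fetch

    def query(self, entity, **filters):
        # Pull matching rows from every source that serves this entity,
        # tagging each row with where it came from. No data is copied.
        for name, fetch in self._sources.items():
            for row in fetch(entity):
                if all(row.get(k) == v for k, v in filters.items()):
                    yield {"source": name, **row}


# Two stand-in systems of record, each holding its own slice of the data.
crm = {"customer": [{"id": 1, "name": "Acme", "tier": "gold"}]}
warehouse = {"customer": [{"id": 1, "annual_spend": 120_000}]}

fabric = DataFabric()
fabric.register_source("crm", lambda e: crm.get(e, []))
fabric.register_source("warehouse", lambda e: warehouse.get(e, []))

# One query surfaces the customer's data from both systems at once.
rows = list(fabric.query("customer", id=1))
```

The point of the sketch is the contract, not the plumbing: consumers ask the fabric for an entity, and the fabric worries about where the pieces reside.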

Potential roadblocks: The main roadblocks to the proliferation of data fabric “largely stem from the cultural side, not the infrastructure side,” said Glaser. “Enterprises have made significant investments in data lakes and warehouses, so shifting to yet another data management solution can be daunting. It’s not just about winning hearts and minds, it’s about changing patterns of behavior. The shift to data fabric will not happen overnight.”

Business benefits: Data fabric means “less drag on the enterprise to build and maintain data lakes and warehouses—with their significant operational overhead,” said Glaser. Data fabric enables organizations “to rapidly discover, unify, secure, and optimize data—regardless of where it resides—in ways that previously weren’t possible without complex data engineering and constant, timely maintenance,” Glaser explained. “With a single, unified architecture for business data, enterprises benefit from earlier discovery of data for application developers, allowing for faster development times for data-driven applications. Additionally, it makes development of business applications much easier, as developers can view and access enterprise data as the business thinks about it—as one cohesive concept, not individual pieces.”

Data Quality Improvement Tools
Data quality has long been a concern of enterprises, and as attention has focused on data-driven initiatives, technologies that can assure peak data quality have moved front and center. “With the onset of generative AI and other AI processes becoming increasingly common, the volume of data generated is likely to increase exponentially,” said Sharad Varshney, CEO of OvalEdge. “This means it is becoming more difficult to identify, isolate, and operationalize quality data assets. Consequently, data quality tools, improvement processes, and features are increasingly important in 2023. Data quality improvement tools will develop greater capabilities in response to the proliferation of AI data.”

Progress: “The latest tools use AI to develop new ways to deal with unprecedented data volume,” said Varshney. “Using AI for data quality improvement, companies can collect data automatically using predetermined rules, spot duplicate data and anomalies, complete broken datasets, and undertake data validation, and all without the need for manual processes.”
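The rule-driven checks Varshney describes—duplicate detection, anomaly flagging, and validation against predetermined rules—can be sketched as a single automated pass over a dataset. The rules, thresholds, and field names below are assumptions chosen for illustration.

```python
# Illustrative automated data-quality pass: spot duplicates, flag
# out-of-range anomalies, and validate required fields without manual
# review. The rule set is a hypothetical example.

RULES = {
    "required": ["id", "amount"],           # fields that must be present
    "range": {"amount": (0, 10_000)},       # plausible business bounds
}

def run_quality_checks(records, rules):
    """Return (record_index, issue) pairs for every rule violation."""
    issues = []
    seen_ids = set()
    for i, rec in enumerate(records):
        # Duplicate detection, keyed on the record id.
        if rec.get("id") in seen_ids:
            issues.append((i, "duplicate"))
        seen_ids.add(rec.get("id"))
        # Validation: required fields must be present and non-null.
        for field in rules["required"]:
            if rec.get(field) is None:
                issues.append((i, f"missing:{field}"))
        # Anomaly detection via simple range rules.
        for field, (lo, hi) in rules["range"].items():
            val = rec.get(field)
            if val is not None and not (lo <= val <= hi):
                issues.append((i, f"out_of_range:{field}"))
    return issues

records = [
    {"id": 1, "amount": 250},
    {"id": 1, "amount": 250},      # duplicate
    {"id": 2, "amount": None},     # broken record
    {"id": 3, "amount": 99_999},   # anomaly
]
issues = run_quality_checks(records, RULES)
```

Production tools layer learned rules and statistical anomaly models on top of this pattern, but the shape—declared expectations, automated enforcement—is the same.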

Potential roadblocks: Low-quality data could impact the performance of tools, said Varshney. “AI needs good quality data to operate correctly and accurately. With data and AI so intrinsically linked, data quality really is the common denominator in conversations about both.”

Business benefits: Data quality underpins every business benefit of data-driven innovation, said Varshney. “If data is of low quality, it’s simply useless. The list of positive outcomes when data is used as an innovation tool is endless. However, unless data is of high quality, there’s nothing you can do with it. And with the expected influx of data as a result of generative AI, data quality is more important than ever.”

Digital Twins
Digital twins—in which organizations, systems, and facilities are digitally replicated—are dramatically impacting how data is managed and delivered. “As a digital replica of any asset, system, or process, this technology orchestrates and integrates data, allowing easy access for multiple parties/departments to review and analyze information in order to gain insights and make more informed decisions,” said Mike Campbell, chief product officer at Bentley Systems. “Historical and current data is used to create predictive models that provide an outlook on future performance. This can improve efficiency, optimize operations, provide ongoing maintenance recommendations, and more. Digital twins can be applied everywhere, from manufacturing and construction to supply chain and logistics management. Examples of digital twins currently in use include buildings, roads and bridges, electric grids, water networks, and even entire campuses and cities.”
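As a toy illustration of the predictive-model idea Campbell describes, the sketch below mirrors one asset's sensor history and extrapolates a maintenance date from a simple linear trend. The pump, the vibration threshold, and the readings are all hypothetical; real twins use far richer physics and simulation models.

```python
# Toy digital twin: mirrors a pump's daily vibration readings and uses
# the history to forecast when maintenance will be needed. Asset,
# threshold, and trend model are illustrative assumptions.

class PumpTwin:
    """Digital replica of a pump, tracking one vibration reading per day."""

    def __init__(self, maintenance_threshold=10.0):
        self.history = []
        self.threshold = maintenance_threshold

    def ingest(self, reading):
        self.history.append(reading)

    def days_until_maintenance(self):
        # Fit a least-squares line to the history, then extrapolate to
        # the day vibration crosses the maintenance threshold.
        n = len(self.history)
        xs = range(n)
        mean_x = sum(xs) / n
        mean_y = sum(self.history) / n
        slope = sum((x - mean_x) * (y - mean_y)
                    for x, y in zip(xs, self.history)) / \
                sum((x - mean_x) ** 2 for x in xs)
        if slope <= 0:
            return None  # no degradation trend detected
        intercept = mean_y - slope * mean_x
        crossing_day = (self.threshold - intercept) / slope
        return max(0, round(crossing_day - (n - 1)))

twin = PumpTwin()
for reading in [4.0, 4.5, 5.0, 5.5, 6.0]:  # vibration rising 0.5/day
    twin.ingest(reading)

eta = twin.days_until_maintenance()  # days until threshold is crossed
```

With readings climbing 0.5 units per day from 6.0 toward the 10.0 threshold, the twin projects maintenance in 8 days—the "ongoing maintenance recommendation" pattern in miniature.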

Progress: Many industries, such as construction, engineering, and manufacturing, “are already realizing the benefits of digital twin technology, but there is still so much untapped potential for the technology,” said Campbell. “As the technology continues to evolve and become more accessible, it’s likely that we will see more widespread adoption and use in the coming years, particularly with critical infrastructure projects, as digital twins enable efficiencies across all lifecycle stages—from design and construction through operations and maintenance.”

Potential roadblocks: As with any new technology venture, it may be difficult to identify compelling use cases and garner the right organizational resources, said Campbell.

Business benefits: Digital twin technology “streamlines processes, fosters collaboration, and optimizes operations across the board,” said Campbell. “Digital twins can provide insights to businesses that allow for new revenue streams, automation of select processes, enhanced skill sets, an increased return on investment, quality of deliverables, and much more. Additionally, digital twin technology can provide real-time feedback on an asset’s performance, allowing for continuous improvement and optimization throughout the entire lifecycle. This is really just the beginning for digital twin technology.”
