Winning With a Modern Data Strategy at Data Summit Connect 2020


New approaches are available to modernize, streamline, and accelerate data management processes. It is critical to stay up-to-date with the latest ways to extract value from data with a modern data strategy.

At Data Summit Connect, a free 3-day series of data-focused webinars, Seth Earley, CEO, Earley Information Science, and Ralph Aloe, director, Enterprise Information Management, Prudential Financial, offered strategies for smarter data management in a session, titled "Winning With A Modern Data Strategy."

In his presentation, titled "Preparing Your Data Infrastructure for Agility, Adaptability, and Emerging Technology," Earley, author of the new book "The AI-Powered Enterprise: Harness the Power of Ontologies to Make Your Business Smarter, Faster and More Profitable," discussed approaches for closing the gap between a modern application infrastructure and the quality data needed to support it.

Data is a critical input to strategic imperatives. The problem, said Earley, is that many organizations are finding that their data and infrastructure are not ready for newer applications such as conversational AI and predictive analytics.

When initiating a data project, it is important that advances be measurable and linked to business value, because leadership is often reluctant to make the necessary investments without clearly demonstrable ROI, said Earley.

Demonstrating Benefit through the Process

To be successful, said Earley, data initiatives must be understandable, measurable, and linked to an outcome.

"Basing things on metrics is critically important because that is how you demonstrate ROI. If you don't think you can demonstrate ROI, I guarantee you that you can if you deconstruct the process and start looking at the inputs and the outputs," said Earley.

These inputs and outputs can include customer behavior and feedback, employee behavior and feedback, process performance, total cost of service, and product performance. Customer behavior can include buying more or less, or abandoning shopping carts; customer feedback might include complaints and comments on social media; employee behavior can include tenure; and product performance might be measured by quality and the level of returns.

A whole range of metrics can be defined for standard accountability aspects, said Earley.

All stakeholders will have their own specific perspective, tools, metrics, and priorities, depending on factors such as the person's role and industry, said Earley. It is important to maintain data and quality scorecards along the way that show how processes are being enhanced and how well the data is supporting them, leading to objectives in support of a specific outcome with high business value.
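
As a concrete illustration of the kind of scorecard Earley described, the following is a minimal sketch in Python using pandas; the dataset, column names, and quality checks are hypothetical and are not taken from the presentation.

    import pandas as pd

    # Hypothetical customer extract; columns and records are illustrative only.
    customers = pd.DataFrame({
        "customer_id": [101, 102, 103, None],
        "email": ["a@example.com", None, "c@example.com", "d@example.com"],
        "policy_value": [1200.0, 950.0, -10.0, 400.0],
    })

    # Score each quality dimension as the share of records passing its check.
    scorecard = pd.Series({
        "completeness: customer_id": customers["customer_id"].notna().mean(),
        "completeness: email": customers["email"].notna().mean(),
        "validity: policy_value >= 0": (customers["policy_value"] >= 0).mean(),
    })

    print(scorecard.round(2))  # e.g., 0.75 means 3 of 4 records pass a check

Tracked over time, scores like these give stakeholders the measurable, outcome-linked view that Earley argued is needed to sustain investment.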

Getting Data in Order

Earley also covered the importance of ontologies and knowledge graphs. An ontology can be thought of as the "knowledge scaffolding" that provides the organizing principles that can be overlaid on top of any system or data source to enable disparate systems to communicate. An ontology is fundamental to AI, enabling an understanding of industry and business terminology.

Ontologies define the structure of knowledge graphs, which link that structure to data to enable access and integration. Knowledge graphs are particularly valuable to organizations because they enable integration of structured and unstructured information, provide a mechanism to leverage and operationalize ontologies, and reconcile inconsistencies in data and terminology.
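
To make the pattern concrete, here is a minimal sketch using the open source Python library rdflib; the ontology classes, properties, and instance data are hypothetical, standing in for records drawn from disparate systems.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, RDFS

    EX = Namespace("http://example.com/ontology#")
    g = Graph()

    # Ontology layer: the "knowledge scaffolding" of classes and relationships.
    g.add((EX.Customer, RDF.type, RDFS.Class))
    g.add((EX.Product, RDF.type, RDFS.Class))
    g.add((EX.purchased, RDFS.domain, EX.Customer))
    g.add((EX.purchased, RDFS.range, EX.Product))

    # Instance layer: data from two (hypothetical) source systems, mapped
    # to the shared vocabulary so disparate records become linkable.
    g.add((EX.cust_42, RDF.type, EX.Customer))
    g.add((EX.sku_7, RDF.type, EX.Product))
    g.add((EX.sku_7, RDFS.label, Literal("Term Life Policy")))
    g.add((EX.cust_42, EX.purchased, EX.sku_7))

    # Query across the integrated graph with SPARQL.
    q = """
        PREFIX ex: <http://example.com/ontology#>
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?customer ?label WHERE {
            ?customer ex:purchased ?product .
            ?product rdfs:label ?label .
        }
    """
    for row in g.query(q):
        print(row.customer, row.label)

Because the ontology, rather than any one source system, defines the vocabulary, new sources can be mapped in without reworking the applications that query the graph.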

In the end, getting and retaining leadership buy-in for data improvement initiatives will include the following steps:

  • Building a data remediation business case
  • Prioritizing where to focus efforts
  • Engaging the organization by translating objectives into tangible goals that align with business needs
  • Translating messages according to what is important to business stakeholders and how the results will be measured
  • Building metrics scorecards and dashboards to course correct, decide on interventions, and show progress

Earley was followed by Aloe, who discussed the modernization of his company's infrastructure using data virtualization. After more than 140 years of acquiring, processing and managing data across multiple business units and multiple technology platforms, Prudential wanted to establish an enterprise-wide data fabric architecture to allow data to be available where and when it's needed. Data virtualization has helped to democratize data within Prudential while providing robust security and governance, and is viewed as a critical component in Prudential’s journey to the cloud.

Aloe discussed why Prudential chose data virtualization technology to create a logical data fabric that spans its entire enterprise. In his presentation, titled "Accelerating Data Management at Prudential with Logical Data Fabric," Aloe said that at the outset, Prudential faced challenges with governance procedures, which were not consistently adopted, persistent across the organization, or comprehensive. The data architecture was fragmented and disconnected, there was no standard data operations platform, and ETL was everywhere, with different technology used across various business units. Data quality was also an issue, he said.

Data virtualization, said Aloe, has turned out to be an easy way for the company to improve its data architecture and operations without having to remediate everything.
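
The following is a toy Python sketch of the idea behind such a logical layer, not Prudential's actual platform: two sources with inconsistent schemas stay where they are, and a "view" maps them to a common vocabulary at query time instead of copying data through ETL. All names are hypothetical.

    import pandas as pd

    # Two hypothetical source systems with inconsistent schemas, standing in
    # for data left in place on separate platforms.
    annuities = pd.DataFrame({"CUST_NO": [101, 102], "BAL": [5000.0, 7500.0]})
    life = pd.DataFrame({"customer_id": [101, 103],
                         "face_amount": [250000, 100000]})

    def customer_view() -> pd.DataFrame:
        """Logical 'view': renames each source's columns to a shared
        vocabulary and joins at query time; nothing is copied or staged."""
        mapped = annuities.rename(
            columns={"CUST_NO": "customer_id", "BAL": "annuity_balance"}
        )
        return mapped.merge(life, on="customer_id", how="outer")

    print(customer_view())

Consumers code against the shared vocabulary, which is how a virtual layer can enable access without re-platforming the underlying systems.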

The benefits have spanned multiple areas:

  • Data governance: The organization has built out its metadata mastering capabilities across disparate data platforms
  • Data architecture: There is reduced complexity, and data discovery and access processes are easier
  • Data operations: There is support for power users and there has been a "leapfrogging effect" for modernization, enabling access without re-platforming older systems
  • Data quality: There are now centralized enrichment and quality services, as well as plug-and-play on ingest and/or consumption

Webcast replays of Data Summit Connect presentations are available on the DBTA website at www.dbta.com/DBTA-Downloads/WhitePapers.

