Designing For Speed & Scalability at Data Summit 2019


New technologies contribute to the speed and scale of a modern data platform. But as data size and complexity grow with Big Data, data quality and data integration issues must still be addressed.

Similar to many other mission-critical data management situations, clinical trials are fraught with missteps and data quality issues.

At Data Summit 2019, in his presentation "Designing a Fast, Scalable Data Platform," Prakriteswar Santikary, VP and global chief data officer at ERT, discussed how to architect a modern, cloud-based, real-time data integration and analytics platform that ingests any type of clinical data (structured, unstructured, binary, lab values, etc.) at scale from any data source.

The clinical trial industry faces challenges in data collection (multiple sources, devices, and data types); data processing and access (real-time processing, data silos, and lack of visibility); oversight and monitoring; and data quality.

“These things really make the clinical trial industry so complex,” Santikary said.

Clinical trial teams are drowning in data. These teams carry a high administrative burden to extract and re-enter data into other systems. It is difficult to filter out noise to identify the real issues that require attention, and more effort is spent confirming the accuracy of data than interpreting it for decision making.

ERT’s modern platform architecture puts high-quality data in the hands of the people who need it. It can transform data into strategic business assets using a modern, secure, cloud-based data platform, Santikary explained.

“We are transforming data into strategic business assets,” Santikary said. “It enables our customers to do real-time decisioning and ingest data at scale.”

To unleash the potential of data, Santikary recommended using master data management, data quality, data profiling, and data policies and standards to foster cross-organizational collaboration.

A modern data foundation requires the ability to ingest any data of any type at any velocity, and to process data at scale with an API-first design that supports self-service, real-time, and batch workloads, along with advanced analytics capabilities including AI and ML.
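As a rough illustration of that idea, the sketch below shows a minimal ingestion layer in which the real-time (streaming) and batch paths share one data-quality gate. This is a hypothetical example, not ERT's actual platform; the class and field names (`DataPlatform`, `ingest_stream`, `ingest_batch`) are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class DataPlatform:
    """Toy ingestion layer: streaming and batch share the same validation."""
    records: list = field(default_factory=list)   # accepted records
    rejected: list = field(default_factory=list)  # records failing quality checks

    def _validate(self, record: dict) -> bool:
        # Minimal data-quality gate: require a known source and a payload.
        return bool(record.get("source")) and "payload" in record

    def ingest_stream(self, record: dict) -> None:
        """Real-time path: one record at a time."""
        (self.records if self._validate(record) else self.rejected).append(record)

    def ingest_batch(self, records: list) -> None:
        """Batch path: reuses the streaming path, so both apply identical checks."""
        for record in records:
            self.ingest_stream(record)

platform = DataPlatform()
platform.ingest_stream({"source": "ecg-device", "payload": {"bpm": 72}})
platform.ingest_batch([
    {"source": "lab", "payload": {"glucose": 5.4}},
    {"payload": {"orphan": True}},  # rejected: missing source
])
print(len(platform.records), len(platform.rejected))  # 2 1
```

Routing both paths through one validation step is one way to keep data quality consistent regardless of how the data arrives, which is the point of a unified ingestion design.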

“You need a platform that enables you to play with your data quickly,” Santikary said.

This Data Summit 2019 presentation is available for review at http://www.dbta.com/DataSummit/2019/Presentations.aspx.
