We all know that monitoring data and insights are key to successful enterprise growth, but how can operations teams make those insights actionable?
DBTA recently held a webinar with Collibra titled, “From Dev to Data—Embracing Observability into DataOps,” featuring speakers Ankur Gupta, director of product marketing at Collibra, and Eric Gerstner, data quality principal at Collibra, who discussed making data actionable through data monitoring and observability.
Poor data quality is typically treated rather than prevented; that is, data quality issues are only resolved once they have already done their damage. Employing data observability practices, rooted in IT observability, can correct data quality in production, not after the fact, according to Gerstner.
Focused on the impact of production-grade data on its consumers, data observability targets data downtime, the period when data is partial, erroneous, missing, or inaccurate, to prevent data quality issues before they affect business decisions. With a data operations team, or DataOps as Gerstner called it, enterprises can fight data downtime through strategies similar to DevOps and ITOps: DataOps represents the intersection of CI/CD and agile methodology, scaling the software development lifecycle (SDLC) in alignment with business data goals. According to Gerstner, this is called “the data development lifecycle,” and it incorporates personas such as data operators and data engineers to eradicate data downtime during data development.
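To make the idea concrete, the kind of pre-release check a DataOps pipeline might run can be sketched as follows. This is an illustrative example only, not Collibra's product or any method described in the webinar; the function and field names are hypothetical. The sketch flags "data downtime" conditions, such as partial or missing data, before a dataset is released downstream:

```python
# Hypothetical sketch of a DataOps-style quality gate: inspect a dataset
# for "data downtime" symptoms (partial, missing, or erroneous values)
# before it reaches downstream consumers. Names here are illustrative.

def check_quality(rows, required_fields, min_rows):
    """Return a list of issues found in `rows`; an empty list means
    the dataset passes and can be released downstream."""
    issues = []
    # Partial data: fewer rows than the pipeline expects for this run.
    if len(rows) < min_rows:
        issues.append(f"partial data: expected >= {min_rows} rows, got {len(rows)}")
    # Missing data: required fields that are absent or null.
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) is None:
                issues.append(f"row {i}: missing value for '{field}'")
    return issues

orders = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": 2, "amount": None},  # erroneous record caught before release
]
problems = check_quality(orders, required_fields=("order_id", "amount"), min_rows=2)
```

Run in CI/CD before publishing the dataset, a check like this turns a data quality problem into a blocked release rather than a bad business decision.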
As data generation grows, so do its challenges, and enterprises will need to focus on data observability to enhance data quality and improve business decisions. Using IT observability as a guide, the enterprise data landscape calls for preventive measures against poor data quality to boost long-term business efficacy.
Gerstner further explained the details and relevance of observability in DataOps, which can be viewed in an archived version of the webinar, here.