Many data teams struggle to prove the business impact of their work. Traditional metrics such as uptime or throughput don't resonate with executives, making it hard to justify investments in modernization and tooling.
DBTA recently held a webinar, "DataOps in 2026: Connecting Data Orchestration to Business Outcomes for Proving ROI," with Ashley Kuhlwilm, senior product marketing manager at Astronomer, who discussed a practical framework for measuring and communicating the business value of DataOps.
Data underpins the most critical digital initiatives in every organization, she explained. Success depends on getting the right data to the right place at the right time.
But today's reality is often data flow chaos and slow, brittle systems: fragmented workflows, increasingly unreliable data, and observability gaps.
According to Gartner, “By 2026, a data engineering team guided by DataOps practices and tools will be 10 times more productive than teams that do not use DataOps.”
DataOps tools specifically focus on the end-to-end flow of data, Kuhlwilm said, and orchestration is the foundation of DataOps. Benefits include:
- Unified complex data estates: Manage sophisticated workflows that integrate data from any source in any location.
- Cross-team collaboration: Express workflows through modern developer tooling, unlocking agility and faster time to value (a minimal sketch follows this list).
- Simplified tech stacks: Collapse disparate data tools into a unified stack with integrated build, test, and release cycles.
- Governance through visibility: Centralized workflow observability elevates data quality and trust, aligned to business value.
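Expressed as code, an orchestrated workflow can look like the minimal Apache Airflow sketch below. The DAG name, schedule, and sample records are illustrative assumptions, not details from the webinar; the point is that extract, transform, and load steps become versioned, testable Python rather than scattered scripts.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["dataops"])
def daily_sales_pipeline():
    """Illustrative pipeline: pull from a source, transform, load to a warehouse."""

    @task
    def extract() -> list[dict]:
        # Placeholder for pulling records from any source system (API, database, files).
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

    @task
    def transform(records: list[dict]) -> float:
        # Placeholder transformation: aggregate daily revenue.
        return sum(r["amount"] for r in records)

    @task
    def load(daily_revenue: float) -> None:
        # Placeholder for writing to the warehouse table the business reports against.
        print(f"daily_revenue={daily_revenue}")

    load(transform(extract()))


daily_sales_pipeline()
```

Because the workflow lives in code, it can move through the same build, test, and release cycle as any other software, which is what makes the cross-team collaboration and governance benefits above practical.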
Leading organizations make trusted data readily accessible through a unified orchestration strategy. Getting there requires change in both culture and technology, she said. Steps include:
- Align with business priorities
- Reframe value delivered to the business
- Treat data as a product
- Adopt a unified DataOps platform
- Leverage expert services and best practices to accelerate adoption and scale
An ROI measurement roadmap should consist of:
- Identifying where you are on the data velocity curve
- Calculating your baseline
- Setting stage-appropriate targets
- Communicating value in business terms
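As a rough illustration of the "calculate your baseline" and "communicate value in business terms" steps, the sketch below converts engineering-level measurements into an annual dollar figure. Every input value (rates, hours, incident counts) is a placeholder assumption for illustration, not a benchmark from the webinar.

```python
# Baseline sketch: translate pipeline toil into a business-facing cost figure.
# All input values are illustrative placeholders, not figures from the webinar.

HOURLY_RATE = 95.0                # assumed fully loaded cost of a data engineer, USD/hour
ENGINEERS = 6                     # assumed team size
MAINTENANCE_HOURS_PER_WEEK = 10   # assumed hours per engineer spent firefighting pipelines
INCIDENTS_PER_MONTH = 8           # assumed data-quality or pipeline incidents
HOURS_PER_INCIDENT = 4            # assumed mean engineer-hours to resolve each incident

weekly_toil_cost = ENGINEERS * MAINTENANCE_HOURS_PER_WEEK * HOURLY_RATE
monthly_incident_cost = INCIDENTS_PER_MONTH * HOURS_PER_INCIDENT * HOURLY_RATE
annual_baseline = weekly_toil_cost * 52 + monthly_incident_cost * 12

print(f"Annual cost of pipeline toil and incidents: ${annual_baseline:,.0f}")
```

A stage-appropriate target can then be framed as reducing those hours by a given percentage, with the savings restated in the same business terms executives already use.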
According to Kuhlwilm, Astronomer offers Astro—a unified DataOps platform to build, run, and observe Apache Airflow with enterprise-grade performance at scale.
An archived version of the full webinar, featuring a more in-depth demo, discussion, and Q&A, is available here.