A lot of digital ink has been spilled over the hottest topic in data integration today: the data mesh. But is the data mesh truly achievable for a modern enterprise? And is it truly novel, or simply a rehashing of older paradigms and techniques?
DBTA held a webinar with Paul Lacey, senior director of product marketing at Matillion, who discussed what a decentralized approach to data management could look like in practice.
The data mesh exists to overcome common challenges with centralized data infrastructure, he explained. It is a new approach based on a modern, distributed architecture for analytical data management.
Data mesh can reduce data downtime and accelerate the adoption of machine learning, AI, and other kinds of modern analytics.
Data mesh can address several common pain points, such as the difficulty of maintaining a single source of truth, brittle ETL pipelines, and unresponsive centralized analytics teams.
A data mesh is governed by four core tenets: domain ownership, data as a product, infrastructure abstraction, and distributed governance.
There are two types of data mesh, Lacey said: virtualized and physical. A physical data mesh requires:
- Accessible transformation
- Multi-modal data pipelines: Batch or Change Data Capture (CDC)
- Output connectors
- Strong data cataloging
- Strong policy enforcement
- Version control of data product schemas
- Strong data culture
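To make one of these requirements concrete, "version control of data product schemas" means each domain publishes its data as a product with an explicitly versioned contract that consumers can validate against. The sketch below is purely illustrative; the product name, fields, and validation logic are hypothetical and not part of Matillion or any specific data mesh tooling:

```python
# Hypothetical sketch: a versioned schema contract for a "customer_orders"
# data product. All names and fields are illustrative assumptions.
SCHEMA_V2 = {
    "product": "customer_orders",
    "version": 2,
    "fields": {
        "order_id": int,
        "customer_id": int,
        "total_usd": float,
    },
}

def validate(record: dict, schema: dict) -> list:
    """Return a list of contract violations; an empty list means the record conforms."""
    errors = []
    for name, expected_type in schema["fields"].items():
        if name not in record:
            errors.append("missing field: " + name)
        elif not isinstance(record[name], expected_type):
            errors.append(name + ": expected " + expected_type.__name__)
    return errors

good = {"order_id": 1, "customer_id": 42, "total_usd": 19.99}
bad = {"order_id": 1, "total_usd": "19.99"}

print(validate(good, SCHEMA_V2))  # conforming record: no violations
print(validate(bad, SCHEMA_V2))   # missing field and wrong type reported
```

Because the schema carries an explicit version number, a domain team can publish `SCHEMA_V3` alongside `SCHEMA_V2` and give downstream consumers time to migrate, rather than breaking pipelines silently.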
Matillion can help create a data mesh, he noted: approachable interfaces and visual workflows get more people across the business working with live data.
“Accessible interfaces are essential for success,” Lacey said.
An archived on-demand replay of this webinar is available here.