The concept of delivering data as a product pervades the data strategies of organizations looking to improve efficiency and drive positive business outcomes. The idea, however, remains somewhat elusive: while data products sound effective, how does an enterprise actually deliver them, and what approaches are necessary to accommodate its various assets?
Sue Laine, director of solution strategists and DI thought leader at Quest Software, and Yetkin Ozkucur, professional services director at Quest Software, joined DBTA’s webinar, Model to Marketplace: Tackling Data as a Product Delivery, to offer their expertise in data products and data cataloging, as well as how automation can be used to propel data product delivery.
Laine began by offering a recommendation from Gartner, asserting that, along with a federated analytics architecture, a product approach to the delivery of analytics artifacts is rather valuable. For enterprises taking a product approach to analytics delivery, these businesses “will find increased trust across domains and reduced redundancies in development,” according to Gartner.
While even Gartner affirms the value of adopting a data product approach, what exactly is a data product?
Laine defined the term as a product that facilitates an end goal using data, whose primary stakeholders consist of data modelers, data scientists, data analysts, end data consumers, and decision makers. She further explained that there are three components that inform the meaning of data products, which include:
- Access, making data products available, discoverable, and reusable
- Value, or the business value derived from the data products to drive data valuation
- Ownership, taking note of who is responsible for maintaining, monetizing, and nurturing the data product throughout its life
Product delivery, Laine explained, follows a particular workflow, where the business requests a data product from the data architect; the data architect models the data product and generates the code; and then the business provides the guardrails and promotes the marketplace.
Why leverage data modeling in data product delivery? According to Laine, a structured model gives “everyone a seat at the table and a common understanding of what this product is and what this product is going to look like.”
“It's that blueprint around the data and around that product of what you’re putting together,” Laine continued. Additionally, a structured model around data product delivery offers uniform definitions, descriptions, and sensitive data information to be used in data product curation.
On top of data modeling, automated data catalog/data intelligence capabilities can contribute to optimizing data product delivery, such as through:
- The generation of source-to-target data mappings and ETL code to gather data
- Automated generation of data lineage for data consumers
- Automated data profiling and data quality scoring to build data trust
- Integrated business context and governance guidance for consumers
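To make the profiling and quality-scoring idea above concrete, here is a minimal sketch of how a catalog tool might profile columns and roll the results into a trust score. This is an illustration only, not Quest's implementation; the column names, metrics, and scoring scale are hypothetical.

```python
# Hypothetical sketch of automated data profiling and quality scoring.
# A real data intelligence tool would profile many more dimensions
# (type conformance, freshness, referential integrity, etc.).

def profile_column(values):
    """Profile one column: completeness and uniqueness ratios."""
    total = len(values)
    non_null = [v for v in values if v is not None]
    completeness = len(non_null) / total if total else 0.0
    uniqueness = len(set(non_null)) / len(non_null) if non_null else 0.0
    return {"completeness": completeness, "uniqueness": uniqueness}

def quality_score(table):
    """Average completeness across columns, scaled to a 0-100 trust score."""
    profiles = {col: profile_column(vals) for col, vals in table.items()}
    avg = sum(p["completeness"] for p in profiles.values()) / len(profiles)
    return round(avg * 100), profiles

# Hypothetical customer table with a partially populated email column.
customers = {
    "customer_id": [1, 2, 3, 4],
    "email": ["a@x.com", None, "c@x.com", None],
}
score, profiles = quality_score(customers)
print(score)  # 75: customer_id is fully populated, email is half empty
```

Surfacing a score like this next to each data product in the catalog is what lets consumers judge trustworthiness at a glance rather than inspecting the data themselves.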
Laine also argued that data marketplaces drive value for data products by ensuring that trusted data is easy to discover and centralized in one place; pointing data users quickly to high-value data using value indicators; leveraging automated workflows to facilitate data requests; addressing data sharing concerns; and more.
Ozkucur then led webinar viewers through a detailed demo of these data product delivery processes, using Quest technology to facilitate each step.
For an in-depth discussion of data product delivery and approaches, you can view an archived version of the webinar here.