Business leaders have long realized that becoming data driven is a critical business imperative, not just for improved business outcomes but, in many cases, for their very survival. Recently, Harvard Business Review published an article by Randy Bean, CEO and founder of NewVantage Partners and author of Fail Fast, Learn Faster: Lessons in Data-Driven Leadership in an Age of Disruption, Big Data, and AI. Bean suggested that for many organizations, this quest remains a work in progress. In fact, he referenced a NewVantage survey that found 90% of executives felt adapting their cultures is still a major challenge. On the positive side, cloud adoption enables faster analytics and is estimated to grow at a CAGR of more than 24% over the coming years.
This search for more agile data and analytical processes has led to the emergence of DataOps frameworks that can improve an organization's responsiveness to the business users who demand more information. As with its earlier cousin, DevOps, the aim is to avoid very large, complex data projects that take months or years to deliver value by breaking them into smaller projects that can be delivered in a short amount of time.
The foremost benefit of DataOps is agility but, at the same time, it lowers the risk of delivering projects that no longer match current business requirements. Making data broadly available inside, and increasingly outside, the organization is becoming a powerful way for companies to drive value from data. In fact, according to Gartner, “Data and analytics leaders who share data externally generate three times more measurable economic benefit than those who do not … and are 1.7 times more effective at showing demonstrable, verifiable value to D&A stakeholders.”
Companies that embrace these modern data sharing approaches benefit not just from the new business value they can create, but also from the speed with which they can embrace new analytical approaches, open new data sets, and modernize aging, on-premises enterprise data warehouse deployments. Digital transformation is a high priority for most organizations, and modernizing traditional IT infrastructures offers tremendous cost savings for businesses.
DataOps’ Dual Mandate Comes Front and Center
However, as is the case with agile software development, going faster is great, but only if you can deliver the right quality. And, in the world of DataOps, the dual mandate of data democratization and the need for privacy, security, and compliance is pushed front and center. The speed and simplicity of cloud analytics adoption is opening a new world of risk for corporate data governance teams tasked with ensuring their organizations remain compliant in an increasingly complex world of global regulations. In short, managing data and access governance in a traditional top-down, IT-controlled manner, while managing data, analytical, and visualization technologies via a decentralized, agile DataOps framework, is a recipe for disaster.
To unleash the true value of a DataOps program, stakeholders need an equally agile, yet strong, data access governance framework that facilitates the sharing of analytical data and ensures data is governed in compliance with centralized data governance programs. A governed data sharing approach empowers domain experts and analysts not just with timely access to the data they need, but also with a self-service ability to manage the policies and rules controlling access to that data for their respective teams.
Data Access Governance Framework—Taking the First Step
Establishing and embedding a data access governance framework into a DataOps program from the start is pivotal to ensuring that the enterprise remains compliant at every step. A major challenge the framework helps avoid is the bottleneck and misalignment of skills and mandates in a centralized, IT-driven governance model. On one hand, IT, which tends to own policy creation and enforcement, knows the technologies it operates but has no real knowledge or understanding of the data and its business context. On the other, the business owners know the data and analytical context but lack the technology knowledge. A well-designed data access and security governance framework will provide IT a centralized, single-pane view of sensitive data and policy management, while enabling them to delegate the responsibility of sharing datasets, or providing access to specific datasets, to analysts or stewards located in the business community.
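A minimal sketch of this delegation model, using hypothetical names (a central `PolicyRegistry` that IT owns, plus business stewards who may only manage policies for datasets explicitly delegated to them) rather than any particular product's API:

```python
from dataclasses import dataclass

@dataclass
class Policy:
    dataset: str
    allowed_roles: set  # roles permitted to access the dataset

class PolicyRegistry:
    """Central, single-pane registry owned by IT."""

    def __init__(self):
        self.policies = {}     # dataset -> Policy
        self.delegations = {}  # steward -> set of datasets they manage

    def delegate(self, steward: str, dataset: str):
        """IT delegates policy management for one dataset to a business steward."""
        self.delegations.setdefault(steward, set()).add(dataset)

    def set_policy(self, actor: str, dataset: str, allowed_roles: set):
        """Stewards may only set policies on datasets delegated to them."""
        if dataset not in self.delegations.get(actor, set()):
            raise PermissionError(f"{actor} may not manage {dataset}")
        self.policies[dataset] = Policy(dataset, allowed_roles)

    def can_access(self, role: str, dataset: str) -> bool:
        """Enforcement point: is this role allowed to see this dataset?"""
        policy = self.policies.get(dataset)
        return policy is not None and role in policy.allowed_roles
```

The point of the design is the split of responsibilities: IT keeps the single registry (and audit trail), while each steward self-serves policy changes only within their delegated scope.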
Another key consideration in a DataOps program is a unified, or universal, framework to manage data access and security governance across hybrid- or multi-cloud environments. The freedom and flexibility to deploy across multiple clouds and different data technologies offers great value but can add to the governance burden if not effectively automated. For instance, customer data might reside in Amazon EMR, in Snowflake running in a private AWS instance, and in files stored in S3 buckets. Policies must be consistently defined and automatically applied across this diverse landscape to avoid unauthorized use of data, which can result in compliance violations or operational downtime.
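One common way to achieve that consistency is policy as code: define a logical policy once, then render it into each platform's native enforcement mechanism. The sketch below is hypothetical (the policy schema and renderer functions are illustrative, not any vendor's actual API), showing one masking rule translated into a Snowflake-style SQL statement and a Ranger-style JSON rule for Hive on EMR:

```python
# Hypothetical logical policy: mask the email column of the customers
# dataset for everyone except the privacy_officer role.
MASK_POLICY = {
    "dataset": "customers",
    "column": "email",
    "visible_roles": ["privacy_officer"],
}

def to_snowflake_sql(policy: dict) -> str:
    """Render the logical policy as a Snowflake-style masking policy."""
    roles = ", ".join(repr(r.upper()) for r in policy["visible_roles"])
    return (
        f"CREATE MASKING POLICY mask_{policy['column']} AS (val STRING) "
        f"RETURNS STRING -> CASE WHEN CURRENT_ROLE() IN ({roles}) "
        f"THEN val ELSE '***MASKED***' END"
    )

def to_ranger_rule(policy: dict) -> dict:
    """Render the same policy as a Ranger-style masking rule for Hive/EMR."""
    return {
        "resource": {"table": policy["dataset"], "column": policy["column"]},
        "maskType": "MASK",
        "exemptRoles": policy["visible_roles"],
    }
```

Because both renderers read from the same `MASK_POLICY` definition, a change to the logical rule propagates to every backend on the next deployment, rather than relying on administrators updating each platform by hand.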
Most organizations have already embarked on their journey toward faster insights and data-driven business innovation and realize that they need to transform their data practices from traditional, IT-centric, centralized design principles to a DataOps approach. As traditional waterfall data projects give way to this more agile method of data sharing, it's imperative that enterprises put an effective data access governance framework in place. This foundation will unify the discovery of sensitive data, the management and enforcement of policies, masking and encryption, and the auditing and reporting that are critical to rapid data onboarding and sharing. More importantly, it will unleash the true value of an organization's DataOps program by ensuring data shared across the business remains managed, governed, and in compliance with existing centralized data governance programs.