Meet the Data Privacy Challenge: Creating a Culture of Responsibility


Google’s stiff fine for non-compliance with the EU’s General Data Protection Regulation (GDPR) has demonstrated the regulation’s potential impact on a company’s bottom line and made GDPR a topic of discussion in many corporate boardrooms. Yet a recent study by IT Governance revealed that only 29% of Europe-based organizations are GDPR-compliant. The situation may well be worse in the U.S., where the passage of the California Consumer Privacy Act (CCPA), along with proposed federal privacy standards, means organizations will continue to face a rapidly evolving, increasingly complex regulatory compliance landscape with ever-higher stakes.

What’s needed is a dynamic governance environment that allows an organization to constantly adapt to both evolving regulations and the rapidly changing data infrastructure. However, the only way to achieve this environment is to create a foundational “culture of responsibility.”

Why It Matters

While the actual and potential fines related to GDPR and the CCPA are generating the headlines, there is more to the data privacy challenge than just meeting specific regulatory requirements. The amount of personal information that companies collect continues to skyrocket, with data lakes growing to petabyte scale. Further, thanks to abuses of personal information at Facebook and other companies, consumers are taking a closer interest in how the companies they deal with treat their information. Companies now need to consider how the effectiveness of their data governance strategies affects their reputation and consumer confidence.

However, even well-meaning companies are struggling to govern their ever-burgeoning data stores. They can’t accurately identify which data falls under which privacy regulations. They can’t consistently ensure appropriate usage across their various data platforms. And most lack the ability to audit their internal data processes and demonstrate compliance to regulators.

On a more tactical level, a fundamental friction exists between those charged with protecting data and those who need to use it for business purposes. An expanding range of business users see data as their resource and want fast, self-service access to it, often viewing data stewardship as an obstacle to ease of access. Meanwhile, data stewards and platform teams see their primary responsibility as securing the data and supporting regulatory compliance. A successful governance strategy must bridge this divide.

A Culture of Responsibility

Such a governance strategy requires a fundamental shift in how organizations think about data governance and privacy. Consider the different attitudes toward privacy expressed by Google and Facebook versus Apple. At companies such as Google and Facebook, collected personal information is a product, and the revenue to be generated by that product will, at a minimum, inevitably lead employees to blur the line between uses that have been clearly authorized through customer consent and those that haven’t. Meanwhile, Apple’s Tim Cook, in a recent speech at a privacy conference in Brussels, acknowledged that protecting personal information is critical for our society. After praising the EU’s implementation of GDPR, he said, “It is time for the rest of the world ... to follow your lead. We at Apple are in full support of a comprehensive federal privacy law in the United States.” He also addressed the complaint that such regulation is a barrier to innovation, saying, “This notion isn’t just wrong, it’s destructive.”

Accepting that privacy regulations are not a barrier to innovation is the seed that needs to grow into a culture of responsibility.

As with the shaping of all corporate cultures, creating a culture of responsibility requires a vision actualized by people, processes, and technology. Leaving out any one of these elements will doom the initiative to failure.

Vision

The vision underlying the culture of responsibility is that data managed in a way that protects personally identifiable information and enables regulatory compliance is a corporate asset, not an operational burden. Companies that successfully govern their data benefit in many ways, including better business decisions, greater agility, increased consumer confidence, and reduced costs associated with, for example, regulatory fines, legal discovery, data storage, and lost employee productivity.

Most importantly, this vision must be embraced as something that makes the company better and stronger and that is critical to reaching the company’s strategic goals. It can’t be seen as a burden, an obstacle, a joke, or an afterthought. That can only happen if top executives, including, potentially, a chief privacy officer (CPO), are frequently and visibly seen embracing it themselves.

People

Once company executives describe their vision for a new strategic initiative, they often assume the people working for them will automatically put in place the mechanisms to make that vision a reality. However, the “people problem” may well be the most difficult challenge to overcome when attempting to change the corporate culture. No matter what technology and processes are instituted, employees will find ways to develop work-arounds, create new shadow IT, lie, cheat, or even inadvertently circumvent a proper procedure.

Preventing—or at least minimizing—this starts with executive accountability. It simply isn’t enough for executives to “support” a policy. They must take an active role in ensuring the implementation of that policy and hold themselves—and their peers—accountable when the policy isn’t implemented correctly or in a timely way.

When it comes to the culture of responsibility, it’s also essential for executives to understand that we are no longer talking simply about checking off boxes on a regulatory compliance cheat sheet. Privacy is an ethical responsibility. All employees should understand that they are custodians of customer data, and just as they would want their private information protected at other companies, it is their job (not somebody else’s) to protect their customers’ data. It must also be made clear that when this ethical responsibility comes up against revenue potential, ethics must win. Representatives of public companies often discuss their fiduciary responsibility to maximize profits for their shareholders, but it is time they recognize that today, as demonstrated by Google and Facebook, abusing private information will ultimately have a negative impact on profits.

Operationalizing this attitude may not be easy in many organizations. For many companies, it will require the addition of a CPO, who will have the power and focus to create the culture of privacy. It will also require ongoing training and regular reinforcement. Creating a document or presentation that announces the vision and lists the commitments simply isn’t enough. The culture of responsibility must be reinforced at every level of the organization.

Processes

To create a culture of responsibility, privacy and auditability must be designed into every step of every process lifecycle, and existing processes may need to be redesigned to incorporate privacy at every step. Whether it’s customer onboarding, product or service design, or customer upsell programs, process designers need to take responsibility for what is happening with the data. Just asking people to be better isn’t going to suffice; the organization’s systems and processes need to enable and incentivize the right behavior. For example, designing distributed stewardship that’s scalable and agile, as in the sketch below, enables the broader organization to work toward the agreed-upon goals, whereas centralizing all stewardship responsibilities quickly creates a choke point and incentivizes people to find workarounds.
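As a simple illustration of what scalable, distributed stewardship can look like in practice, consider the following Python sketch (all names and roles are hypothetical, not drawn from any particular product). Each data domain registers its own accountable steward, and access requests are routed to the steward who owns the relevant dataset rather than being funneled through a single central team:

```python
from dataclasses import dataclass, field


@dataclass
class DataDomain:
    """A business data domain with its own accountable steward."""
    name: str
    steward: str                      # person or team accountable for this domain
    datasets: set = field(default_factory=set)


class StewardshipRegistry:
    """Central catalog of domains; approval authority stays with the owning steward."""

    def __init__(self):
        self._domains = {}

    def register(self, domain: DataDomain):
        self._domains[domain.name] = domain

    def route_request(self, dataset: str, requester: str) -> str:
        """Route an access request to the steward who owns the dataset."""
        for domain in self._domains.values():
            if dataset in domain.datasets:
                return (f"Access request for '{dataset}' by {requester} "
                        f"routed to steward '{domain.steward}'")
        return f"No steward owns '{dataset}'; request escalated to the governance office"


# Two domains, two stewards -- no single approval choke point
registry = StewardshipRegistry()
registry.register(DataDomain("marketing", "marketing-data-stewards", {"campaign_clicks"}))
registry.register(DataDomain("finance", "finance-data-stewards", {"invoices"}))
print(registry.route_request("invoices", "analyst@example.com"))
```

The point of the design is that the registry is only a catalog; decision authority lives with the domain stewards, which is what keeps approval workload from piling up in one place.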

Technology

In the era of big data, with data increasingly stored in multiple clouds and on IoT edge devices, securing and governing data in a scalable and agile way is technically more difficult than ever. As yet, there is no standard way to fit security and governance capabilities into the big data journey, even as a growing array of data users—including data engineers, data scientists, and business analysts—demands fast, self-service access to the data in massive data lakes using a variety of analytics and machine learning tools.

The result has been the creation of fragmented solutions that make copies of data and otherwise increase the risk of inconsistent compliance. To support a culture of responsibility, organizations must adopt a governance solution that adheres to the following core tenets. It must be data-centric and independent of the applications being used to access the data. It must provide the required level of access control for both structured and unstructured data. And it must automate enforcement. Such a solution should also take a holistic approach to visibility into the data, providing fast, simple insight into past access patterns, the data any particular person can access, and who has access to a particular data asset. Finally, an effective solution must future-proof the organization to handle the frequent and significant regulatory and technology changes it will face over the coming years.
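To make those tenets concrete, here is a minimal, hypothetical sketch in Python (not the API of any specific governance product) of a data-centric policy check: classification tags travel with the dataset rather than with any one application, enforcement is automated at request time, and every decision is logged so auditors can answer both “who accessed what” and “who can access what”:

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_log = logging.getLogger("audit")

# Data-centric metadata: classification tags belong to the dataset itself,
# independent of whichever application is used to access it.
DATASET_TAGS = {
    "customer_profiles": {"pii", "gdpr"},
    "public_product_catalog": set(),
}

# Policy: which roles may read data carrying which classification tags.
ROLE_PERMITTED_TAGS = {
    "privacy_officer": {"pii", "gdpr"},
    "data_scientist": {"gdpr"},        # GDPR-scoped data, but not raw PII
    "marketing_analyst": set(),        # untagged (public) data only
}


def check_access(user: str, role: str, dataset: str) -> bool:
    """Automated enforcement: allow only if the role covers every tag on the dataset."""
    required = DATASET_TAGS.get(dataset, {"unclassified"})  # unknown data is denied by default
    allowed = required <= ROLE_PERMITTED_TAGS.get(role, set())
    # Every decision is recorded, enabling audits of past access patterns.
    audit_log.info("%s | user=%s role=%s dataset=%s decision=%s",
                   datetime.now(timezone.utc).isoformat(), user, role, dataset,
                   "ALLOW" if allowed else "DENY")
    return allowed


def who_can_access(dataset: str) -> list:
    """Visibility: which roles currently have access to a given data asset."""
    required = DATASET_TAGS.get(dataset, {"unclassified"})
    return [role for role, tags in ROLE_PERMITTED_TAGS.items() if required <= tags]


check_access("alice", "marketing_analyst", "customer_profiles")   # DENY, logged
check_access("bob", "privacy_officer", "customer_profiles")       # ALLOW, logged
print(who_can_access("customer_profiles"))                        # ['privacy_officer']
```

A real deployment would pull tags and policies from a metadata catalog and enforce them at the storage or query layer, but the shape is the same: the policy follows the data, enforcement is automatic, and every decision leaves an audit trail.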

Failure Will Be Costly

The ability to comply consistently with evolving privacy regulations is now a core competency for every organization. Those that approach developing this competence from an ethical perspective, creating a culture of responsibility that empowers every employee to support the compliance goal, will ultimately achieve that goal faster and at lower cost. Most important, they will future-proof their organizations against the inevitable regulatory changes while enabling agility and innovation without the fear of compromising the ethical necessity to protect their customers’ and employees’ private information. 

For more articles like this, check out the Cyber Security Sourcebook here.

