Earlier this year, Google became the first major tech giant to be hit with a General Data Protection Regulation (GDPR) fine: €50 million, approximately $56.8 million. The stated reason: Google did not give users enough information about its consent policies or sufficient control over how their personal data was being used.
GDPR has been in full effect since May 2018 and gives European Union citizens stronger rights over their personal data. These rules apply to any organization worldwide that holds EU customers’ data.
Under GDPR, companies must obtain users’ genuine, express consent before using their personal information for any purpose beyond the one for which it was originally collected. Consent procedures (whether opting in or opting out) are expected to be clear, overt and understandable.
The GDPR also sets strict rules for how this data can be used once it is collected. Most people who provide consent understand they may be signing over permission for their data to be used in efforts like analytics and targeted advertising. But they may not be granting permission for their data to be used in highly technical processes that they likely don’t understand, like application testing—thus representing a landmine for consent missteps.
How Application Testing Can Pose a Threat to Compliance
Thorough application testing ensures that software meets functionality and usability requirements and delivers expected outcomes. Development and testing teams often pull their test data from live production systems; according to a recent report, 86% of businesses use live customer data for application testing. Real customer data (often residing on mainframes) is considered preferable because testers believe it provides the most realistic assessment of how an application will perform “in the wild” for real people.
However, an organization that allows development and testing teams to use real customer data for testing is exposing sensitive customer information and risking a significant data breach. And if it does so without customers’ explicit consent, it is committing a serious GDPR violation as well.
Most customers understand that giving consent means their information may be used for things like targeted mailings, advertising and even analytics. But unless those customers are software developers, application testing is a foreign, highly technical concept they are unlikely to fully understand.
GDPR is raising the bar for consent. Companies are now expected to explain more and be more transparent in describing how data will be used, all while keeping the language concise and not bundling simple consent with more complex use cases. This puts some organizations in the challenging position of having to succinctly explain a highly technical topic in an online world where customers are fickle and impatient, and where more convenient, less confusing competitors are just a click away.
Application testing is therefore one discipline where it is often easier to remove the obstacle of securing consent in the first place. Tools and techniques exist to do this, chief among them data masking. Masking, or desensitizing, the data means that personally identifiable information is no longer visible or at risk. Done properly, data masking keeps the data realistic and preserves its original format, so the application is still being tested with data that is as “real” as possible. The beauty of this approach is that as long as it is reasonably difficult to identify an individual from the masked data, GDPR compliance is met without having to secure consent.
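To make the idea concrete, here is a minimal sketch of what format-preserving masking can look like. It is an illustration only: the function name, the keyed-hash scheme, and the sample record are this example’s own assumptions, not a reference to any particular masking product. Letters map to letters, digits to digits, and separators are left alone, so masked records keep the shape that realistic test data needs.

```python
import hashlib
import string

def mask_value(value: str, secret: str = "masking-key") -> str:
    """Deterministically mask a string while preserving its format:
    letters map to letters (case preserved), digits map to digits,
    and punctuation and spacing are left untouched."""
    # Derive a pseudorandom byte stream from a keyed hash of the value,
    # so the same input always masks to the same realistic-looking output.
    digest = hashlib.sha256((secret + value).encode()).digest()
    out = []
    for i, ch in enumerate(value):
        b = digest[i % len(digest)]
        if ch.isdigit():
            out.append(string.digits[b % 10])
        elif ch.isupper():
            out.append(string.ascii_uppercase[b % 26])
        elif ch.islower():
            out.append(string.ascii_lowercase[b % 26])
        else:
            out.append(ch)  # keep separators like @, -, and spaces
    return "".join(out)

# A hypothetical customer record keeps its shape but loses identifiability.
record = {"name": "Maria Jansen", "phone": "+31 6 1234 5678"}
masked = {key: mask_value(val) for key, val in record.items()}
```

One design note: because the masking here is deterministic, the same customer always masks to the same fake value, which keeps relationships between test records intact across tables. Real masking tools add further safeguards (checksum-valid numbers, referential-integrity rules, irreversibility guarantees) that a sketch like this does not attempt.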