Why Quality Data Matters Now More Than Ever: Q&A with Syniti SVP Tyler Warden


In the quest for digital transformation and the push to become "data-driven," data quality is taking on greater importance than ever before. Recently, Tyler Warden, senior vice president, product and engineering, Syniti, shared his insights on how organizations can improve the quality of their data and, as a result, enable more informed decision making.

What does Syniti provide?

Syniti helps enterprises deliver trusted data for better decision making, business operational improvements that save money, compliance that reduces risk, and agility to leapfrog the competition. Our mission is to help organizations see the value and power of trusted data to ignite growth and reduce risk. Our Syniti Knowledge Platform and consulting expertise empower organizations looking to level up their data game.

Syniti was previously known as BackOffice Associates. What is the significance of the name change in terms of the market you are addressing and what Syniti provides to customers?

When we set out to solve companies' data challenges during the late 1990s, our name reflected what we did as back-office support. Now, data is an even more powerful and strategic asset, and we wanted our name to reflect how data functions in the modern world. Our goal is to deliver trustworthy data that is understandable at all levels of an organization. As the world and data evolved together, it was time our brand evolved with them.

A lot of attention is being placed now on using data effectively and providing data insights to the right people at the right time. How accurate do you think the data is that companies are using?

Our research with HFS has revealed that only 5% of C-level executives have a high degree of confidence in the data they have. This lack of confidence is the result of years of C-suite leaders viewing data as an IT problem rather than an asset to be leveraged. It wasn’t seen as a business problem until recently. Now the question is, does bad data hurt companies enough to do something about it? The window of long-term sustainable growth is closing for businesses that have yet to use their data at an advanced level.

How can companies assess the quality of their data?

Defining “good data” at a technology level is easy; defining how ready the data is to run and advance the business is another matter altogether. I think any assessment should start with a technical assessment paired with a strategic, top-down approach. Take a key KPI, just one, use it to scope the business processes and data that feed that KPI, and baseline the fit-for-purpose quality of that data.
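To make that concrete, here is a minimal sketch of what baselining the data behind a single KPI might look like, assuming a hypothetical on-time-delivery KPI and an order table held in a pandas DataFrame. The column names, checks, and sample data are illustrative assumptions for this example, not part of Syniti's product.

```python
# Hypothetical baseline of fit-for-purpose quality for the data behind one KPI
# (on-time delivery). Column names and checks are illustrative assumptions.
import pandas as pd

def baseline_kpi_data_quality(orders: pd.DataFrame) -> dict:
    """Profile the data elements that feed the on-time-delivery KPI."""
    checks = {
        # Completeness: the KPI cannot be computed without these fields.
        "promised_date_present": orders["promised_date"].notna().mean(),
        "shipped_date_present": orders["shipped_date"].notna().mean(),
        # Validity: a shipment should not precede its order date.
        "ship_after_order": (orders["shipped_date"] >= orders["order_date"]).mean(),
        # Uniqueness: duplicate order IDs would double-count deliveries.
        "order_id_unique": 1 - orders["order_id"].duplicated().mean(),
    }
    return {name: round(float(score), 3) for name, score in checks.items()}

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, 2, 3],
        "order_date": pd.to_datetime(["2024-01-02", "2024-01-03", "2024-01-03", "2024-01-05"]),
        "promised_date": pd.to_datetime(["2024-01-10", "2024-01-12", "2024-01-12", None]),
        "shipped_date": pd.to_datetime(["2024-01-09", "2024-01-01", "2024-01-11", "2024-01-15"]),
    })
    print(baseline_kpi_data_quality(sample))
```

The point of a baseline like this is not the specific checks but the scoping: only the data elements that actually feed the chosen KPI are profiled, which keeps the assessment tied to a business outcome rather than to the whole data estate.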

What is the problem with achieving high-quality data? What challenges are organizations facing in this regard?

There are typically four ways that poor-quality data gets created: a human enters it incorrectly; a machine, interface, or migration creates or moves bad data; a system corrupts it through poor usage or bad code; or the business changes, so data that was once fit-for-purpose is no longer valid.

The challenge with achieving data quality is balancing those four causes of poor data and addressing the right ones at the right time. Given limited resources (time, money, attention, etc.), deciding where to make the most meaningful investment at the right moment can be hard to get right.

What can companies do to improve the quality of their data and use it more effectively? How do they get started?

The best thing an organization can do to improve the quality of its data is to choose an area of focus and just get started. If you’re unsure where to start, begin with the business process that you know is causing problems, whether that’s waste, rework, frustration, or lost revenue. From there, identify the key data elements used in that business process and start to wrap those data elements with the rules and policies that determine whether the data is fit-for-purpose or causing issues.
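As an illustration of what wrapping key data elements with rules can look like in practice, the sketch below applies a few declarative fit-for-purpose rules to a customer-master record. The field names and rules are assumptions chosen for the example; they are not Syniti-specific and would differ for every business process.

```python
# Hypothetical fit-for-purpose rules wrapped around key data elements of a
# customer master record; field names and rules are illustrative assumptions.
import re
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    element: str                   # data element the rule applies to
    description: str               # business meaning of the rule
    check: Callable[[dict], bool]  # returns True when the record passes

RULES = [
    Rule("tax_id", "Tax ID must be present for invoicing",
         lambda r: bool(r.get("tax_id"))),
    Rule("country", "Country must be a two-letter ISO code",
         lambda r: bool(re.fullmatch(r"[A-Z]{2}", r.get("country") or ""))),
    Rule("payment_terms", "Payment terms must be one of the approved values",
         lambda r: r.get("payment_terms") in {"NET30", "NET60", "PREPAID"}),
]

def evaluate(record: dict) -> list[str]:
    """Return the descriptions of every rule the record violates."""
    return [rule.description for rule in RULES if not rule.check(record)]

if __name__ == "__main__":
    customer = {"tax_id": "", "country": "us", "payment_terms": "NET30"}
    print(evaluate(customer))  # two violations: missing tax ID, bad country code
```

Keeping each rule paired with a plain-language description is what lets the same definition of "fit-for-purpose" be read by the business and enforced by the technology.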

You can also take advantage of other external events that have a major data impact, such as a new system implementation or data migration. If a large initiative that involves data is about to start or has just started, working quality into the foundation of that program can lift the overall data posture of the organization.

Ideally, working toward a data-conscious culture through everyday leadership will deliver the deeper, long-term benefit of actually understanding the important role that data plays at all levels of the organization, not just in the IT department.

Once organizations get control of their data quality what do they need to maintain it?

The solution here is the classic people, process, and technology triangle. Part of the key to an effective DataOps program is having the people involved committed to and focused on bringing trusted data to the enterprise, and managing their performance and goals accordingly. The proper data KPIs will naturally lead to processes defined and designed to help achieve those KPIs. Finally, the right tooling is needed to support these processes and people. The right tooling will ensure both the collaboration and orchestration of DataOps processes and also take some of the load off the people through automation and intelligence.
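One small, hypothetical example of the kind of automation that can take load off the people: a check that recomputes a data-quality KPI and raises an alert only when it falls below its target, so a scheduler or orchestrator can run it unattended and people step in on exceptions. The rule set, record fields, and threshold below are assumptions for illustration.

```python
# Hypothetical scheduled data-quality check: recompute a quality score and
# alert when it drops below target so people only intervene on exceptions.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dataops.quality")

QUALITY_TARGET = 0.98  # illustrative target: share of records passing all rules

def passing_share(records: list[dict], rules) -> float:
    """Share of records that satisfy every rule (0.0 to 1.0)."""
    if not records:
        return 1.0
    passed = sum(1 for r in records if all(rule(r) for rule in rules))
    return passed / len(records)

def run_quality_check(records: list[dict], rules) -> bool:
    """Return True when the KPI meets its target; log an alert otherwise."""
    score = passing_share(records, rules)
    if score < QUALITY_TARGET:
        log.warning("Data quality KPI %.3f below target %.3f", score, QUALITY_TARGET)
        return False
    log.info("Data quality KPI %.3f meets target", score)
    return True

if __name__ == "__main__":
    rules = [lambda r: bool(r.get("material_id")), lambda r: r.get("uom") in {"EA", "KG"}]
    sample = [{"material_id": "M-1", "uom": "EA"}, {"material_id": "", "uom": "LB"}]
    run_quality_check(sample, rules)  # logs a warning: 0.500 below target 0.980
```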

Looking ahead, what are the advantages for companies that invest now in the quality of their data?

Investing in the quality of your data now will only increase your chances of staying ahead in the market while remaining compliant with a changing regulatory landscape. Investing in your data now allows your company to take bigger data-informed risks in innovation. It will empower your organization to make bold decisions faster from a foundation of trust in the data and the knowledge extracted from those raw data points. It will also lessen the impact of changing regulations and public disclosure requirements.

 


