Data Quality and MDM Programs Must Evolve to Meet Complex New Challenges


Though the term “big data” is still debated, it represents something qualitatively new. Big data does not just mean the explosion of transactional data driven by the widespread use of sensors and other data-generating devices. It also refers to the desire and ability to extract analytic value from new data types such as video and audio. And it refers to the trend toward capturing huge amounts of data produced by the internet, mobile devices, and social media.

The availability of more data, new types of data, and data from a wider array of sources has had a major impact on data analysis and business intelligence. In the past, people would identify a problem they wanted to solve and then gather and analyze the data needed to solve it. With big data, that workflow is reversed: companies are realizing that they have access to huge amounts of new data, such as tweets, and are working to determine how to extract value from it.

Data quality programs will have to evolve to meet these new challenges. Perhaps the first step will be developing methods for creating appropriate metadata. In general, big data is complex, messy, and comes from a wide variety of sources, so good metadata is essential. Data classification, efficient data integration, and the establishment of standards and data governance will also be critical elements of data quality programs that encompass big data.
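To make this concrete, here is a minimal sketch, not drawn from the article, of how a governance program might record and check basic metadata for an incoming big data feed. All names in it (DatasetMetadata, validate_metadata, the classification labels, the "twitter-firehose" source) are hypothetical illustrations.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class DatasetMetadata:
        # Basic metadata a governance program might require for each incoming feed.
        source: str              # originating system, e.g. "twitter-firehose"
        steward: str             # person or team that owns quality for this feed
        classification: str      # e.g. "public", "internal", or "restricted"
        schema_version: str
        ingested_at: datetime = field(
            default_factory=lambda: datetime.now(timezone.utc)
        )

    def validate_metadata(meta):
        # Return a list of governance problems; an empty list means the feed passes.
        problems = []
        if not meta.steward:
            problems.append("no data steward assigned")
        if meta.classification not in ("public", "internal", "restricted"):
            problems.append("unknown classification: " + meta.classification)
        return problems

    if __name__ == "__main__":
        feed = DatasetMetadata(
            source="twitter-firehose",
            steward="",                  # deliberately missing to show a failure
            classification="internal",
            schema_version="1.2",
        )
        print(validate_metadata(feed))   # prints ['no data steward assigned']

A record like this, however simple, gives classification, integration, and governance efforts a common starting point for every new data source.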

Ensuring data quality has been a serious challenge in many organizations. Frequently, data quality problems are masked: business processes seem to be working well enough, and it is hard to determine beforehand what the return on investment in a data quality program would be. In addition, in many organizations, nobody seems to “own” responsibility for the overall quality of corporate data. People are responsible for, or sensitive to, their own slice of the data pie but are not concerned with the pie as a whole.

What’s Ahead

It should not be a surprise that in a recent survey of data quality professionals, two-thirds of the respondents rated the data quality programs in their organizations as only “OK” (that is, some goals were met) or poor. On the brighter side, 70% indicated that their company’s management viewed data and information as important corporate assets and recognized the value of improving their quality. On balance, however, data quality must be improved. In another survey, 61% of IT and business professionals said they lacked confidence in their company’s data.

During the next several years, data quality professionals will face a series of complex challenges. Perhaps the most immediate is to be able to view data quality issues within their organizations holistically. Data generated by one division—marketing, let’s say—may be consumed by another—manufacturing, perhaps. Data quality professionals need to be able to respond to the needs of both.

Second, data quality professionals must develop tools, processes, and procedures to manage big data. Since much big data is also real-time data, data quality must become a real-time process integrated into the enterprise information ecosystem. Finally, and perhaps most importantly, data quality professionals will have to set priorities. Nobody can do everything at once.
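As one illustration of what data quality as a real-time process could mean, the following sketch applies simple per-record checks to a stream and quarantines records that fail. The field names and rules are hypothetical, not from the article or any particular product.

    REQUIRED_FIELDS = {"customer_id", "event_time", "amount"}

    def quality_gate(records):
        # Yield only records that pass basic checks; flag the rest for review.
        for record in records:
            missing = REQUIRED_FIELDS.difference(record)  # required fields absent
            if missing:
                # In practice the record would go to a quarantine queue for repair.
                print("quarantined record, missing fields:", sorted(missing))
                continue
            yield record

    if __name__ == "__main__":
        stream = [
            {"customer_id": 1, "event_time": "2014-01-20T12:00:00Z", "amount": 9.99},
            {"customer_id": 2, "amount": 4.50},   # missing event_time
        ]
        print(list(quality_gate(stream)))

Checks like these run as the data arrives rather than in a batch cleanup afterward, which is the shift from periodic data quality projects to a continuous process.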


About the author

Elliot King has reported on IT for 30 years. He is the chair of the Department of Communication at Loyola University Maryland, where he is a founder of an M.A. program in Emerging Media. He has written six books and hundreds of articles about new technologies. Follow him on Twitter @joyofjournalism. He blogs at emergingmedia360.org.

