
AI: Data Quality’s New Frontier

The commercials are amusing. A bunch of technicians are standing around a large airplane. One asks, “Who’s the new guy?” Another points to a monitor on a table and answers, “Watson,” IBM’s artificial intelligence (AI) technology. Watson then informs the group that data suggests replacing capacitor C4. The skeptic nods approvingly, and asks if Watson can help with the coffee maker.

The intensive commercial marketing for Watson is one of the most visible manifestations of the growing applications of AI, or at least of the machine-learning approach to AI. While not the only approach to the idea that machines can be built to "think" as humans do, machine learning currently represents the state of the art in the field. The conceptual roots for machine intelligence were planted more than a half century ago, when Arthur Samuel, a researcher at IBM, programmed a computer to play checkers and coined the term "machine learning." Samuel's insight was that, rather than trying to teach computers about the world so they could complete discrete tasks, it would be possible to program computers to learn about the world and solve a wide range of problems.


Starting in the 1980s, Samuel's approach was captured in a concept called neural networks. The goal of a neural network is to empower a computer to categorize data in the same way that a human mind categorizes data—for example, sorting images according to the objects they contain. The approach then employs probability and feedback to improve results. The system makes decisions based on the way it has classified data and receives feedback about the appropriateness of those decisions. The feedback allows it to refine the choices it makes, in essence "learning" to make better decisions. The learning process is known as training the system.
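The feedback loop described above can be sketched with a classic perceptron, one of the earliest neural-network building blocks. This is a minimal illustrative example, not code from any system mentioned in the article; the toy data, function names, and learning-rate value are all assumptions chosen for clarity.

```python
# A perceptron "learns" to separate two categories of points.
# Whenever its prediction disagrees with the feedback (the true
# label), it nudges its weights toward a better decision.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Adjust weights whenever a prediction disagrees with feedback."""
    w = [0.0, 0.0]  # one weight per input feature
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), label in zip(samples, labels):
            prediction = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = label - prediction      # the "feedback" signal
            w[0] += lr * error * x1         # refine the weights
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, point):
    return 1 if (w[0] * point[0] + w[1] * point[1] + b) > 0 else 0

# Toy training set: label 1 when both coordinates are "large"
samples = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = [0, 0, 1, 1]
w, b = train_perceptron(samples, labels)
```

After a few passes over the data, the repeated corrections settle on weights that classify new points correctly—the "training" the article refers to.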

Emergence of Machine-Learning Systems

After a gestational period that has lasted a generation or more, machine-learning systems such as Watson are beginning to emerge everywhere—in our homes, our cars, and even our pockets. The popular virtual digital assistants Siri, Alexa, and Google Now are powered by machine-learning technologies. The traffic predictions offered by GPS systems are "driven" by machine learning. Video surveillance, spam monitoring, suggestions for contacts on social media, and product recommendations are all machine-learning applications.

And the growth of machine learning in the business arena is even more robust. People are no longer surprised to be contacted by their credit card companies because a charge looks unusual. Online fraud detection is based on machine learning, and so is the scoring of credit applications and mortgages. Machine learning is being deployed throughout healthcare in everything from drug discovery to disease diagnosis. Technology such as Salesforce's Einstein system is applying machine learning to every aspect of customer relationship management. And those are only a few examples.


