
A New Age: AI and Machine Learning Meet the Cloud


Combining containers with a cloud and an orchestration platform such as Docker Swarm or Kubernetes yields a powerful set of capabilities. An orchestration tool automates the deployment of containers and provides a system that scales without manual intervention. These tools also allow containers to be grouped logically, creating an easily manageable application environment.
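As a minimal sketch of that logical grouping, the snippet below lists the pods that make up one application group. It assumes a running Kubernetes cluster, a local kubeconfig, and the kubernetes Python client; the app=web label and the default namespace are purely illustrative.

```python
# Minimal sketch: list the pods that make up one logical application group.
# Assumes a reachable cluster, a local kubeconfig, and `pip install kubernetes`.
from kubernetes import client, config

config.load_kube_config()            # read credentials from ~/.kube/config
core = client.CoreV1Api()

# Containers (pods) are grouped logically with labels such as app=web.
pods = core.list_namespaced_pod(namespace="default", label_selector="app=web")
for pod in pods.items:
    print(pod.metadata.name, pod.status.phase)
```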

Artificial Intelligence and Machine Learning

The combination of AI with machine learning, and the more recent idea of deep learning, is paving the way for companies to go from reactive to proactive, and even predictive. Artificial intelligence is the ability of a computer system to perform tasks that normally require human intelligence. Machine and deep learning give a software system, fed copious amounts of data, the ability to predict outcomes without being explicitly programmed to do so. Based on past actions and outcomes, the computer can make a calculated decision about what to do next. Self-driving cars are possible because of this combination of AI and machine/deep learning. Imagine, some time in the dangerously near future, a cybersecurity system that learns from previous attacks, a marketing system that learns from previous purchases, or a hypervisor that shifts virtual machines across the cloud to maintain optimal performance, or migrates a VM off a host that is about to suffer a hardware failure. In today's world of connected clouds, data is effectively infinite, and when it is analyzed with the immense processing power of today's systems, the intelligent applications built to harness this combination of technologies are virtually limitless.
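As a minimal sketch of that learn-from-past-outcomes idea, the snippet below trains a simple model on historical security events and asks it to classify a new one. It assumes scikit-learn is installed; the feature names and numbers are purely illustrative.

```python
# Minimal sketch of learning from past outcomes: a model trained on historical
# security events predicts whether a new event looks like an attack.
# Assumes scikit-learn is installed; features are illustrative, not real telemetry.
from sklearn.linear_model import LogisticRegression

# Past events: [failed_logins_last_hour, bytes_transferred_mb, new_device (0/1)]
X_history = [
    [0,   12, 0],
    [1,    8, 0],
    [25, 300, 1],
    [40, 950, 1],
]
y_history = [0, 0, 1, 1]            # 0 = benign, 1 = attack

model = LogisticRegression()
model.fit(X_history, y_history)     # "learn" from previous attacks

new_event = [[30, 500, 1]]
print(model.predict(new_event))     # calculated decision on what to do next
```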

Blockchain

We have all heard of Bitcoin. Blockchain is the technology that makes Bitcoin possible. A blockchain is a secure, distributed, automated ledger. When all parties reach consensus on a transaction, the ledger automatically records the event, and that record becomes a permanent, tamper-evident part of the transaction's history. This has been referred to as "a transfer of trust in a trustless world." There are many practical applications for this technology. Applied efficiently, it could be used to track our food supply, trace a manufacturing process, plan the production of needed medications, or support almost anything else that can be conceived in the future.
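As a minimal sketch of the core data structure, the snippet below builds a chain of blocks in which each block carries the hash of its predecessor, so altering any earlier record breaks every later link. Consensus and networking are deliberately omitted, and the record contents are illustrative.

```python
# Minimal sketch of a hash-chained ledger (consensus and networking omitted).
# Each block stores the hash of the previous block, so tampering with any
# earlier record invalidates every block that follows it.
import hashlib
import json

def make_block(data, prev_hash):
    body = {"data": data, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain):
    for prev, curr in zip(chain, chain[1:]):
        recomputed = hashlib.sha256(
            json.dumps({"data": curr["data"], "prev_hash": curr["prev_hash"]},
                       sort_keys=True).encode()).hexdigest()
        if curr["prev_hash"] != prev["hash"] or curr["hash"] != recomputed:
            return False
    return True

genesis = make_block({"event": "ledger created"}, prev_hash="0" * 64)
chain = [genesis,
         make_block({"shipment": "lot 42", "temp_ok": True}, genesis["hash"])]
print(verify(chain))   # True until any earlier block is altered
```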

More Is Better

Why settle for four processing cores when you can have six, or, better yet, how about 1,000? Just as we saw hardware vendors create next-generation servers optimized for hypervisors such as vSphere, we will continue to see huge advancements in GPU acceleration. GPU-accelerated computing uses graphics processors to speed up artificial intelligence and other compute-intensive workloads, from training neural networks to powering self-driving cars. In 2007, NVIDIA pioneered the use of graphics processors to offload general-purpose work from the CPU, and this capability will keep growing more powerful while fitting into an ever smaller footprint.
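As a minimal sketch of offloading work from the CPU to the GPU, the snippet below runs the same matrix multiply with NumPy on the CPU and with CuPy on the GPU. It assumes a CUDA-capable NVIDIA GPU and the cupy package; the matrix size is arbitrary.

```python
# Minimal sketch of GPU offload: the same matrix multiply on CPU (NumPy)
# and on a CUDA-capable GPU (CuPy). Assumes `pip install cupy` and an NVIDIA GPU.
import numpy as np
import cupy as cp

n = 4096
a_cpu = np.random.rand(n, n).astype(np.float32)
b_cpu = np.random.rand(n, n).astype(np.float32)

c_cpu = a_cpu @ b_cpu                 # runs on the CPU cores

a_gpu = cp.asarray(a_cpu)             # copy data to GPU memory
b_gpu = cp.asarray(b_cpu)
c_gpu = a_gpu @ b_gpu                 # same operation, offloaded to the GPU
cp.cuda.Stream.null.synchronize()     # wait for the GPU kernel to finish

print(np.allclose(c_cpu, cp.asnumpy(c_gpu), atol=1e-2))
```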

Universal Translators

Today, we go to a website and click a button to translate the page text, or we ask our smartphone to translate a single word or an entire sentence. As technology packs ever more processing power into a compact footprint, the capacity of applications to adapt and learn will evolve beyond our imagination.


It has been said that we are on the verge of universal translators of the spoken word. In fact, an interesting and instructive event occurred at the New York Giants' opening football game a few weeks ago. A Japanese fan presented a printed ticket that was discovered to be fraudulent when he tried to enter MetLife Stadium in the Meadowlands Sports Complex. The fan, who spoke not a word of English, was directed to the ticket booth to resolve the problem. This seemingly insurmountable task fell to a truly intrepid ticket manager who also happened to be a local high school technology instructor. When she found herself unable to explain the situation, she pulled up a translation application on her smartphone and typed a message, which the app rendered as Japanese text. The fan happened to have the same application on his phone, so he typed his response in his native Japanese, and a dialogue began that ended with the fan purchasing a new, valid ticket and experiencing the strange and violent world of American football. The fan was happy, the Giants sold another ticket, and we should all be aware that universal translators have arrived, much sooner than the 23rd century of Mr. Spock.

New Databases That Are Cloud-Ready

The relational database is 40 years old and only getting older. The mathematical model that serves as the intellectual basis of the RDBMS, relational calculus, has been with us for decades and will continue to exist. However, new NoSQL database technologies have emerged that work on the premise that not all data models need to follow the rigorous rules of the relational model. To succeed in today's cloud world, each of these models and technologies is employed where appropriate. Interoperability and compatibility drive aggregate effectiveness and efficiency. Will classic database technologies support containers as the new data management technologies do? Will newly designed applications keep the data consistent as pieces of the database are scattered across the cloud? The relational model guarantees atomicity, consistency, isolation, and durability (ACID), but is that level of rigor necessary for a particular set of data? Innovative, thought-provoking approaches called "data persona analytics" challenge the necessity of applying the relational model to most datasets. It will be the responsibility of the owner of the data to determine how best to model a dataset to achieve its optimal value while maintaining its validity. In the near future, there will be new and innovative approaches to designing datasets and, most importantly, new inference engines that maximize the value of all data.
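As a minimal sketch of the contrast, the snippet below records the same order first as a relational row inside an ACID transaction and then as a schemaless document. It uses only Python's standard library; the table schema and document fields are illustrative.

```python
# Minimal sketch contrasting the relational and document data models.
# Uses only the standard library; schema and field names are illustrative.
import json
import sqlite3

# Relational model: fixed schema, ACID transaction around the insert.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
with db:                                            # commits atomically, rolls back on error
    db.execute("INSERT INTO orders (customer, total) VALUES (?, ?)", ("Acme", 99.50))

# Document model: schemaless record; each document can carry its own shape.
order_doc = {"customer": "Acme", "total": 99.50,
             "items": [{"sku": "A-1", "qty": 3}]}   # nested data, no rigid schema

print(db.execute("SELECT * FROM orders").fetchall())
print(json.dumps(order_doc))
```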

What’s Ahead

Just as it was with the Industrial Revolution, there exists a fear today that this new wave of technology will create a massive wave of unemployment. That fear is resurfacing because many of the 20th-century tasks performed by humans, tasks that provided steady and consistent employment, are likely to be absorbed into the roles of ever-increasing numbers and types of robots. Remember Elbert Hubbard's words and seek solace in them: "One machine can do the work of fifty ordinary men. No machine can do the work of one extraordinary man." But remember the words of Napoleon Bonaparte, too, and proceed with caution. Consider the impact of the encroaching colossus of the technology industry. The value of the technology when applied to the improvement of life can be immeasurable, but the danger of reliance on a single, all-powerful individual, company, or industry can propel history in an unfavorable direction, even when the vehicle is driverless.
