Living on the Edge: Devices Evolve From Smart to Intelligent

The advent of social media, and its resultant data deluge, was facilitated by the cloud. Organizations added more storage in the cloud and did their analytics in the cloud; the cloud was the best and cheapest option.

In the IoT world, devices are moving from smart to intelligent thanks to advances in artificial intelligence and machine learning. However, this progress is now hindered by the natural limitations of the cloud. To keep moving forward, devices need lower latency, greater independence, and better connectivity.

This challenges the paradigm of doing it all in the cloud. Currently, a smart device is defined by its ability to connect to a network to share data and interact remotely; the mobile phone is the de facto example.

As devices become more intelligent, we as consumers require more from them. Partly, this is because we want more convenience in our interactions with them: we want them to fade into the background, like a thermostat acting as a loyal butler “sensing” our needs. And partly, it is because we want them to perform more tasks efficiently, reliably, and with greater independence. After all, our self-driving cars must be able to operate under any conditions.

So what does this mean? It means we need to decentralize IoT compute power away from the cloud and onto the edge. Intelligent devices will have to support us independently in the real world; a real-time world needs real-time decisions. To avoid latency and disruptions in connectivity, and to avoid the costs of data transfer, it is of the utmost importance that automated control logic runs extremely close to the point where the data originates. The cloud, with its centralized paradigm, is not a good solution for that.

Although advances have been made in throughput and latency, with big improvements still expected from the arrival of 5G and Time-Sensitive Networking, the question will be: Can they outpace the amounts of data coming from the flood of new “things”?

Just think about it: Will the cloud be able to handle the rising intensity of data traffic without becoming a bottleneck? Think of the growing number of datapoints, the types of data collected (what about raw video or sound feeds?), and the frequency with which they are collected. Consider a diesel engine manufacturer collecting 250,000 events per second from a single engine, for example, or the 10GB of data per mile a self-driving car creates.
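A quick back-of-envelope calculation shows why these figures are daunting. The event rate and per-mile volume come from the examples above; the 200-byte event size and 100-mile daily distance are illustrative assumptions, not figures from this article:

```python
# Back-of-envelope estimate of daily IoT data volumes.
# EVENTS_PER_SECOND and GB_PER_MILE come from the examples in the text;
# ASSUMED_EVENT_BYTES and ASSUMED_MILES_PER_DAY are illustrative assumptions.

EVENTS_PER_SECOND = 250_000        # diesel engine telemetry
ASSUMED_EVENT_BYTES = 200          # assumed average payload size
SECONDS_PER_DAY = 24 * 60 * 60

engine_gb_per_day = (EVENTS_PER_SECOND * ASSUMED_EVENT_BYTES
                     * SECONDS_PER_DAY / 1e9)

GB_PER_MILE = 10                   # self-driving car
ASSUMED_MILES_PER_DAY = 100        # assumed daily mileage

car_gb_per_day = GB_PER_MILE * ASSUMED_MILES_PER_DAY

print(f"Single engine: ~{engine_gb_per_day:,.0f} GB/day")
print(f"Single car:    ~{car_gb_per_day:,.0f} GB/day")
```

Even under these modest assumptions, a single engine produces terabytes per day, and that is before multiplying by a fleet.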

And do we really have to transfer all data to the cloud for analysis? According to McKinsey, most IoT data goes unused; less than 1% of the data generated by an oil platform, for example, is used for decision-making purposes. Finding value with IoT analytics is often like finding a diamond in a mountain of rubble. We can accept that 1% of the data has value, but which 1% is it?

The answer to this question is simple: curation. If it can be decided at the moment of origination which data stays on the edge and which is passed on to the analytical systems in the cloud, much is won. However, in order to curate, advanced analytics are needed at the place where the data originates. This means that some analytical capabilities have to move from the cloud to as close as possible to the device, or even onto it; ideally, they would reside on the CPU, on the “silicon.”
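What edge-side curation amounts to can be sketched in a few lines. This is a minimal, assumed illustration: the threshold band, field names, and sample values are invented for the example, and a real deployment would use learned models rather than a fixed rule:

```python
# Minimal sketch of edge-side curation: decide at the point of origin
# which readings stay on the device and which are forwarded to cloud
# analytics. The threshold band and field names are illustrative assumptions.

NORMAL_RANGE = (20.0, 90.0)  # assumed "routine" sensor band

def curate(event):
    """Return 'forward' for anomalous readings, 'local' otherwise."""
    value = event["value"]
    if value < NORMAL_RANGE[0] or value > NORMAL_RANGE[1]:
        return "forward"      # anomaly: worth sending to the cloud
    return "local"            # routine: keep (or discard) on the edge

# Simulated stream of sensor readings
stream = [{"value": v} for v in (45.2, 95.7, 60.1, 12.3, 72.8)]
forwarded = [e for e in stream if curate(e) == "forward"]
print(f"forwarded {len(forwarded)} of {len(stream)} events")
```

Only the anomalous readings cross the network; the routine majority never leaves the device, which is exactly the bandwidth win the curation argument depends on.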

Although major innovations are still needed, especially in in-memory compute, this is entirely possible thanks to advances in compute power and storage, the falling costs associated with both, and the adoption of modern analytical paradigms such as streaming analytics.

Storage, especially, is often overlooked, but it will be critical. Analytics are based on data, and datasets need a place where they can reside and take protected shelter. To make this happen, we need fast, cheap storage that distributes itself across memory and disk through advanced caching mechanisms.
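The memory-and-disk idea can be illustrated with a toy two-tier store. This is only a sketch of the caching principle: a small, fast in-memory tier with least-recently-used eviction to a slower persistent tier (a plain dict stands in for disk here, and the tier size is an arbitrary assumption):

```python
# Toy sketch of tiered edge storage: a bounded in-memory tier backed by
# a slower "disk" tier, with LRU eviction between them. Real edge storage
# engines are far more sophisticated; sizes and names are illustrative.

from collections import OrderedDict

class TieredStore:
    def __init__(self, memory_slots=3):
        self.memory = OrderedDict()   # fast tier, bounded in size
        self.disk = {}                # slow tier, stand-in for persistent storage
        self.memory_slots = memory_slots

    def put(self, key, value):
        self.memory[key] = value
        self.memory.move_to_end(key)              # mark as most recent
        if len(self.memory) > self.memory_slots:
            old_key, old_value = self.memory.popitem(last=False)
            self.disk[old_key] = old_value        # evict LRU entry to disk

    def get(self, key):
        if key in self.memory:
            self.memory.move_to_end(key)          # refresh recency
            return self.memory[key]
        return self.disk.get(key)                 # fall back to the slow tier

store = TieredStore(memory_slots=2)
for i in range(4):
    store.put(f"reading-{i}", i * 1.5)
print(sorted(store.memory), sorted(store.disk))
```

Recent readings stay in the fast tier for low-latency analytics, while older ones spill to cheaper storage, which is the distribution across in-memory and disk described above.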

If this happens, the devices we currently call “smart” will in the future look ridiculously dumb (as most sci-fi movies do after two or more decades).

In that way, the innovations ahead of us might be similar to the inventions that led from the 78 rpm record player to high-fidelity audio equipment. Better amplifier designs, new loudspeaker designs, and FM radio with wider audio bandwidth all led to a marketing term that later became an industry term, Hi-Fi, which paved the way for adoption all over the world.

As Peter Levine said, for intelligent devices we need a Hi-Fi definition of compute, storage, networking, and analytics.

Hi-Fi IoT—how cool would that be?

Bart Schouw is IoT solutions director at Software AG. Based in the Netherlands, he has nearly 20 years of experience in IT in all areas.

