
From the Mainframe Era to the Internet of Things—And What Lies Ahead With Edge Computing


Today, with a quick tap on a handheld device, people seeking a ride can open a cloud-based app, connect with a driver, pinpoint an exact pick-up location via GPS, arrange a drop-off, and make a payment without the need to physically exchange cash or a card. Who would have imagined this reality just 40 years ago, when, before the internet was mainstream, the only computing interface was a green-screen “dumb terminal” limited to display and data entry? Yet this current moment will be the reality for only a fleeting period before a new, transformative—and disruptive—technological era begins.

The migration from the mainframe era to the Internet of Things (IoT) has led to countless life- and industry-changing innovations, and it is only a signal of what is to come with edge computing and beyond. As with all advancements, however, it also brings risks and threats. Given how rapidly technology is being developed and adopted at critical mass, ensuring the integrity of each development, device, and network is vital to global, corporate, and personal security, especially now that so much sensitive data is aggregated and exchanged on these platforms.

For risk management and cybersecurity professionals tasked with integrating and protecting the systems and data of the future, it is valuable to understand the evolution of computing and security in order to address tomorrow’s risks.

The Path to IoT

The journey from mainframe to IoT was driven by pioneering engineers, technicians, and risk and security experts who broke boundaries to create solutions and maximize efficiencies. Mainframe computing began revolutionizing the way work was done and how information was stored. Little by little, computers took on tasks that had always required human labor. Though access was typically through a simple data entry and display terminal, the mainframe still allowed for smarter work. The “one brain” system was relatively secure, with minimal exposure points—which meant there was little concern about network safety and cybersecurity (a term yet to be coined). Data processing professionals could create and maintain a secure system for any company, regardless of size.

Enter the PC. The need for more applications to serve diverse user communities within a firm, combined with the emerging ability to create them, helped usher in the distributed computing era: an empowering time when individuals had access to independent processing power through desktop PCs rather than mere dumb terminals connected to the mainframe. However, by enabling more and more people to manipulate data and information on a given network, the distributed era arrived at a significant and consequential intersection: increased capabilities and complexity, and with them the need for more security.

On the heels of the distributed era, the internet entered the mainstream and was integrated into homes and workplaces. This moment forever changed every aspect of how the world connected and worked. What started as a tool for government use quickly evolved to public and commercial use, from dial-up to wireless connections and from desktops to smartphones. Every new point of connection brought with it new capabilities and new vulnerabilities. Cyber-risks, introduced during the distributed era, escalated in seriousness with malware and denial-of-service attacks. New legislation was enacted to address threats, while security vendors designed products to protect data.

And then came IoT. While the concept of smart devices had long been top-of-mind, they did not become widely accessible until around 2008. Today, just about anything with a plug can connect to a network and be put to a specific purpose. IoT has brought us smart homes, turned mobile devices into personal medical monitoring tools, and streamlined countless aspects of every industry. These capabilities rely on the sharing of sensitive data, which increases the need to secure the many points of exposure.

What’s Next

IoT stands to further increase efficiencies as edge computing is integrated. Edge computing moves computing power as close to the IoT device as possible. Not only will it deliver better performance and lower latency, it can also greatly expand capabilities. The number and impact of the revolutionary outputs we saw with 4G—services such as Uber, higher-definition television, and more—will likely be greatly exceeded with 5G and edge computing.
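
To make the latency benefit concrete, the following is a minimal, illustrative sketch in Python; the delay figures, threshold, and processing function are hypothetical assumptions rather than measurements of any real deployment. It contrasts a round trip from a sensor to a distant cloud region with processing the same reading on a nearby edge node.

import time

# Hypothetical one-way network delays (seconds); illustrative only,
# not measurements of any real deployment.
CLOUD_ONE_WAY = 0.060   # roughly 60 ms each way to a distant cloud region
EDGE_ONE_WAY = 0.005    # roughly 5 ms each way to a nearby edge node

def analyze(reading):
    # Stand-in for whatever analysis the device needs (e.g., anomaly detection).
    return "alert" if reading > 100.0 else "ok"

def round_trip(reading, one_way_delay):
    # Simulate sending a reading to a remote node, processing it, and replying.
    start = time.perf_counter()
    time.sleep(one_way_delay)   # device -> compute node
    result = analyze(reading)   # processing at the remote node
    time.sleep(one_way_delay)   # compute node -> device
    return result, time.perf_counter() - start

for label, delay in [("cloud", CLOUD_ONE_WAY), ("edge", EDGE_ONE_WAY)]:
    result, elapsed = round_trip(120.0, delay)
    print(f"{label}: result={result}, round trip ~{elapsed * 1000:.0f} ms")

Under these assumed delays, the edge path completes roughly an order of magnitude faster than the cloud path, the kind of gap that latency-sensitive applications depend on.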

Ideas once thought futuristic, such as autonomous vehicles and vehicle-to-vehicle communication, will enter the mainstream, with the aim of making roadways safer and reducing accidents. Edge computing will likely bring advancements in telemedicine that break down geographic barriers to care and provide life-saving services to people, regardless of location.

Edge computing will also drive continued movement from AI and machine learning to deep learning—all while continuing to lower latency. This will enable even smarter uses of technology, bringing applications such as facial recognition to airports and better securing infrastructure, including power grids, fuel pipelines, and roadways. But as we invest more and more in the development of these technologies—and trust their capabilities to keep us safe—it becomes even more vital to protect them.
