IOUG Insight: Can High-Tech Really Be Completely Automated?


Much consternation is expressed these days on conference calls and in convention center hallways by technology professionals who worry that automation in high tech will push them out of their jobs. Taglines such as “Automatic upgrades, automatic patching, and self-tuning eliminate human labor” suggest that fear is not completely unfounded. But in real-world scenarios, there is a big difference between utopian visions of the humanless data center and a realistic view of automation in information technology.

Our team was recently working with a longtime Oracle customer on its Exadata quarterly patching. The effort and time that Oracle customers put into preparing for each patching cycle is not insignificant. Like most Exadata customers, the client uses Platinum Services to implement its Exadata patches.

The latest patching implementation did not go well. There were errors during the patching that the support technicians either missed or ignored, and that the customer had to fix after the patching was complete. The problems with Platinum Services patching became serious enough that the client decided to stop using the service for its patching cycle.

During the conversation that led up to the client’s decision to take over the quarterly patching, this question was posed: “Are the problems we are having caused by too much reliance on automation, or on humans?”

Planning for the Unexpected

This may help explain why there is so much focus on machine learning and AI. Scripts written around a narrowly defined set of rules will break when they encounter anomalies, and they often exacerbate the resulting problems because they cannot recognize patterns they were never programmed to handle. Without the ability to “expect” that something will go wrong, an automated process can become an enemy of productivity.
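To make this concrete, here is a minimal sketch in Python of a hypothetical patching step; the commands, error patterns, and advice strings are our own illustration, not taken from any real tool. The brittle version assumes the happy path and plows ahead; the resilient version expects failure, matches known anomaly patterns, and stops with a useful diagnosis before it can compound the damage.

import subprocess
import sys

# Hypothetical error signatures and remediation advice; illustrative only.
KNOWN_ANOMALIES = {
    "insufficient space": "Free space on the staging volume, then re-run.",
    "node not reachable": "Verify connectivity to every node before retrying.",
}

def brittle_step(cmd):
    # Narrow rules, happy path only: the return code is ignored,
    # so a failure here silently corrupts every step that follows.
    subprocess.run(cmd, shell=True)

def resilient_step(cmd):
    # Expects that something may go wrong and checks for known patterns.
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    if result.returncode == 0:
        return
    for pattern, advice in KNOWN_ANOMALIES.items():
        if pattern in result.stderr.lower():
            sys.exit(f"Known anomaly ({pattern}): {advice}")
    # Unknown anomaly: stop cleanly and preserve the evidence for a human.
    sys.exit(f"Unrecognized failure, aborting before further damage:\n{result.stderr}")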

Our team recently deployed two different hyper-converged data protection solutions in Amazon Web Services (AWS). Both vendors offer versions of their products specifically designed for deployment in AWS and Microsoft Azure. A key benefit of hyper-convergence is that the software is highly integrated with, and tuned to, the platform on which it runs. The downside of the hyper-converged methodology becomes clear when one or more of the software’s assumptions about the platform isn’t met.

The deployment of both products is highly automated and takes advantage of the service integration in both clouds. When we deployed the products, following the step-by-step instructions as closely as possible, the deployments broke somewhere in the middle. Because the installers did not anticipate that a deployment might not run perfectly from start to finish in an unfamiliar environment, we were forced to comb through logs to find the error.

We found the error and added a workaround that allowed us to finish that deployment and many more successful ones. Our investigation of the logs also revealed that the error we hit was not the only one in the deployment. In fact, myriad exceptions had been mitigated automatically by the scripts. Moreover, the vendor had a process in place to capture each error and its resolution and quickly incorporate them into its next patch cycle.
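That feedback loop can be sketched roughly as follows, again in Python and again with hypothetical names; this is our reading of how such a loop might work, not the vendor’s actual code. Known exceptions are mitigated automatically, while anything unrecognized is recorded so that a human, and eventually the next patch cycle, can deal with it.

import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("deploy")

class Step:
    # Minimal stand-in for a deployment step (hypothetical).
    def __init__(self, name, action):
        self.name, self.action = name, action
    def run(self):
        self.action()

# Illustrative mitigations keyed by an exception's type name.
MITIGATIONS = {
    "TimeoutError": lambda step: step.run(),  # simple retry
}

def run_steps(steps, unknown_log="unknown_errors.jsonl"):
    # Run steps in order, auto-mitigating known exceptions and
    # capturing unknown ones to feed the next patch cycle.
    for step in steps:
        try:
            step.run()
        except Exception as exc:
            signature = type(exc).__name__
            fix = MITIGATIONS.get(signature)
            if fix:
                log.info("Mitigated known exception %s in %s", signature, step.name)
                fix(step)
            else:
                with open(unknown_log, "a") as f:
                    f.write(json.dumps({"step": step.name, "error": str(exc)}) + "\n")
                raise  # stop the deployment; a human takes over here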

This example would seem to make the case that automation may very well be reducing the need for humans to be involved in tasks executed by machines. However, rather than being eliminated completely, the need for human intervention has shifted, in this case to helping those machines learn what they did wrong so they can anticipate and resolve more anomalies the next time they execute the same task.

The Writing Is on the Wall

The next time your YouTube video is interrupted by an enterprise technology company telling you that humans are expensive and that it has the solution to get rid of them, take it with a grain of salt, but dismiss it at your peril. The last 100 years of history can teach us a lot about the evolution of technology and its impact on different industries. Disruptive technologies such as electricity and the automobile absolutely eliminated the need for human beings in certain roles, whether in gas lamp manufacturing, driving horse-drawn carriages, or other narrowly defined areas those products automated. Yet in every case, the disruption not only shifted humans to different jobs but also created new opportunities that hadn’t existed before.

If this same effect can be expected from the automation of high tech, where do the humans who are replaced by machines go? For the savvy technology professional, the better question might be: What are the new opportunities that don’t exist yet, and how do I best position myself to take advantage of them?

