With Rise of Big Data, IBM Puts Sharp Focus on Software-Defined Infrastructure


As the industry moves into an era of computing characterized by big data, and by the focus on cloud, analytics, mobile, social, and security (CAMSS) applications, customers need to be able to capitalize on these technology advances. That creates a requirement for software-defined environments that can span both on-premises data centers and cloud data centers, said Bernie Spang, VP, Strategy, Software-Defined Environments in IBM Systems & Technology Group, in a recent conversation about IBM’s software-defined storage strategy.

Abstraction, Automation, Optimization

Software-defined is about three things, said Spang: abstraction, automation, and optimization. It is about abstracting the value, the function, into software, separate from any specific integrated hardware implementation. Along with abstracting the value into a software layer, you need open interfaces to that software so that you can automate the infrastructure. The third point is optimization, or analytics-driven optimization: given a programmatic interface to a software layer that is abstracted from the specific hardware underneath it, it becomes possible to automate and monitor the infrastructure, apply analytics, and automate its optimization.
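
To make those three ideas concrete, here is a minimal, hypothetical sketch (not an actual IBM API; the SoftwareDefinedStorage class, its methods, the pool names, and the thresholds are all illustrative assumptions) of how a programmatic interface to an abstracted storage layer enables automated, analytics-driven optimization:

```python
# Hypothetical sketch: a storage function abstracted into software, exposed
# through an open programmatic interface. Names and thresholds are made up
# for illustration; this is not an IBM product API.

class SoftwareDefinedStorage:
    """Abstraction: the storage function lives in software, not in one device."""

    def __init__(self, backends):
        self.backends = backends  # e.g. ["flash-pool", "disk-pool"]

    def metrics(self):
        # Automation hook: monitoring is programmatic, not manual.
        # A real system would query each backend; here we return stub data.
        return {"flash-pool": {"latency_ms": 0.4, "used_pct": 88},
                "disk-pool": {"latency_ms": 6.0, "used_pct": 45}}

    def migrate_cold_data(self, src, dst):
        print(f"migrating cold data: {src} -> {dst}")


def optimize(storage, capacity_threshold=80):
    """Optimization: analytics over the metrics drive automatic action."""
    for pool, m in storage.metrics().items():
        if pool == "flash-pool" and m["used_pct"] > capacity_threshold:
            storage.migrate_cold_data(pool, "disk-pool")


optimize(SoftwareDefinedStorage(["flash-pool", "disk-pool"]))
```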

When dealing with big data, and the data volumes involved in storing it, compute and storage resources need to be optimized, and their alignment optimized, to run high-performance analytics on the data. That makes openness, automation, and programmatic optimization of the infrastructure even more important, Spang noted. “We can’t do it with the traditional, manual, rigid IT environment of the past.”

This is quite different, Spang noted, from the traditional approach, in which the function of the compute system, the storage system, and the networking devices was fused into the capabilities of integrated hardware and software. Proprietary interfaces therefore required expertise in each specific device, along with manual management and optimization of those devices.

The Rise of Automation

“In that world, you need experts deploying them, managing them, optimizing them,” he said. In a software-defined infrastructure environment, by contrast, experts set up the patterns to be deployed automatically, including the attributes and policies for managing the environment.
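
As a rough illustration of that shift (a hypothetical sketch; the pattern schema and the deploy function below are illustrative assumptions, not IBM's pattern format), an expert might encode a deployment pattern once, as data, and let automation apply it to any number of environments:

```python
# Hypothetical sketch of an expert-authored deployment pattern: the attributes
# and policies are captured once as data, then applied automatically. The
# schema and deploy() are illustrative assumptions, not a product interface.

ANALYTICS_PATTERN = {
    "compute": {"nodes": 8, "type": "bare-metal"},
    "storage": {"tier": "flash", "replicas": 2},
    "policies": {"scale_up_at_cpu_pct": 75, "migrate_cold_after_days": 30},
}

def deploy(pattern, cluster_name):
    """Automation replays the expert's decisions without the expert present."""
    print(f"provisioning {cluster_name}:")
    for section, settings in pattern.items():
        print(f"  {section}: {settings}")

# The same pattern can be deployed repeatedly, by non-experts or by schedulers.
deploy(ANALYTICS_PATTERN, "analytics-cluster-01")
deploy(ANALYTICS_PATTERN, "analytics-cluster-02")
```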

Spang suggests thinking of the move from physical infrastructure to software-defined infrastructure as the move from cottage industries to the automated industrial revolution. In that transformation, manufacturing went from expert craftsmen in small businesses, who had to be involved in every aspect of production, with the local reach, high prices, and low volumes that model implied, to an automated industrial model that boosted production and enabled globalization.

As part of this emphasis on software-defined infrastructure, IBM offers Platform Computing software, a software-defined compute capability for managing computing clusters (bare metal and/or virtual machines) to optimize workloads, coupled with Elastic Storage, its software-defined storage (formerly known as General Parallel File System, or GPFS). Platform Computing and Elastic Storage work hand in hand to optimize the virtual compute and storage environment, he noted. IBM also has SoftLayer, which is itself a software-defined infrastructure delivered in the cloud.
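
One way such compute/storage co-optimization can work is data-aware scheduling: placing analytics tasks on the nodes that already hold the relevant data. The sketch below shows that general idea only; the data layout, task list, and scheduler are hypothetical, not the actual interfaces of Platform Computing or Elastic Storage:

```python
# Hypothetical sketch of compute/storage alignment: schedule each analytics
# task on a node that already holds its input data, avoiding network copies.
# The layout and scheduler below are illustrative, not a product API.

data_location = {            # which node holds each data block
    "sales.parquet": "node-2",
    "logs.parquet":  "node-1",
}
tasks = ["sales.parquet", "logs.parquet"]

def schedule(tasks, data_location, default_node="node-1"):
    placements = {}
    for t in tasks:
        # Prefer the node where the data resides; fall back otherwise.
        placements[t] = data_location.get(t, default_node)
    return placements

for task, node in schedule(tasks, data_location).items():
    print(f"run analysis of {task} on {node}  (data-local)")
```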

In addition, IBM continues to offer traditional virtualization capabilities with PowerVM on Power Systems, z/VM on System z, and storage virtualization with SAN Volume Controller, he said. For customers with traditional workloads that make sense to keep running in a SAN storage environment, virtualizing that environment with SAN Volume Controller increases storage utilization and, therefore, cost-efficiency, he said.
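
A simple, hypothetical arithmetic example of why virtualizing SAN storage raises utilization (the array names and capacity figures below are made up for illustration): pooling arrays lets free space on one absorb growth on another, rather than stranding it behind device boundaries.

```python
# Hypothetical arithmetic: three siloed arrays vs. one virtualized pool.
# Capacities and usage figures are made up for illustration.

arrays = {"array-a": (10, 8), "array-b": (10, 3), "array-c": (10, 4)}  # (TB total, TB used)

for name, (total, used) in arrays.items():
    print(f"{name}: {used / total:.0%} used, {total - used} TB stranded if siloed")

pool_total = sum(t for t, _ in arrays.values())
pool_used = sum(u for _, u in arrays.values())
# Virtualized: free space anywhere in the pool can absorb growth anywhere
# else, so a nearly full array no longer forces an immediate hardware purchase.
print(f"pool: {pool_used / pool_total:.0%} used, {pool_total - pool_used} TB shared headroom")
```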

Need for Resource Optimization

But increasingly, given the volume of data organizations are dealing with today, compute and storage resources must be optimized, and aligned, for high-performance analytics, which makes the openness, automation, and programmatic optimization of a software-defined infrastructure all the more important. “You will be hearing more from us on that in 2015,” said Spang.
