Megatrend: Virtualization


Three columns ago, I started a series of articles pointing out that tough times are a-comin' for the DBA profession due to major disruptive changes in the wider IT world (see "2012 Might Really Be the End of the World as We Know It"). In previous columns, I have told you about how our lives will change due to major technological shifts such as solid state disks (SSDs) and massively multicore CPUs.

This time, I want to talk with you about virtualization. There's no doubt in my mind that virtualization is a megatrend. I speak at conferences a lot, and, by that, I mean averaging at least twice per month. One of the things I've heard from numerous data professionals is that they are implementing, will implement very soon, or have already implemented virtualization. And these are not small projects. The organizational executives want to see most, if not all, database instances running on virtualization.

Just a couple of years ago, databases required a special dispensation to even be considered for virtualization. Now, many shops have seen a complete inversion of that rule. In fact, the DBA team needs a special dispensation not to virtualize a new database instance.

What's Driving Virtualization Adoption?

As with most IT operations, it seems like 80% of new adopters of a technology are driven by cost savings. Virtualization delivers cost savings in spades, because database applications typically exert a light load on the CPU, usually in the 20-35% range. That means a single physical machine could support two or three more full virtual machines running a comparable load, stretching our thin IT budgets much farther than in the past. Virtualization offers other benefits, however.
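The consolidation math above can be sketched in a few lines. This is only a back-of-the-envelope estimate; the 20% and 35% load figures come from the column, while the 80% target utilization ceiling (headroom left for spikes) is an illustrative assumption, not a vendor recommendation.

```python
# Rough consolidation estimate: how many database VMs with a given
# average CPU load fit on one physical host, leaving headroom.
# Loads are expressed as whole percentages to keep the arithmetic exact.

def consolidation_ratio(avg_vm_load_pct: int, target_pct: int = 80) -> int:
    """Number of VMs of comparable load one host can carry under the target ceiling."""
    return target_pct // avg_vm_load_pct

for load in (20, 35):
    print(f"{load}% average load -> {consolidation_ratio(load)} VMs per host")
# A 20% load allows 4 VMs per host; a 35% load allows 2.
```

Even at the heavy end of the range, the host runs at least two full database VMs where it previously ran one, which is where the budget stretching comes from.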

The second biggest driver for virtualization, in my experience, is the powerful high availability features provided by most vendors. Both VMware and Microsoft (with their Hyper-V product) offer "live migration" features. For example, VMware's technology, called VMotion, enables you to move a virtual machine (VM) from one physical host to another while the VM is still running. That means that a DBA who's held to an SLA with 99.9% uptime could deal with intermittent hardware problems while never taking the VM offline. By comparison, swapping out a finicky and problematic component on a physical machine, like some DRAM or a hard disk, would either shoot your SLA full of holes or force you to dramatically over-purchase hardware for "just in case" scenarios.
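To see why live migration matters so much under a 99.9% SLA, it helps to work out how little downtime that figure actually permits. A quick sketch (using a 365-day year and 30-day month for simplicity):

```python
# Downtime budget implied by an uptime SLA: the unavailable fraction
# (1 - SLA) multiplied by the length of the measurement period.

def downtime_budget_minutes(sla: float, period_hours: float) -> float:
    """Minutes of allowed downtime for a given SLA over a period."""
    return (1.0 - sla) * period_hours * 60

per_year = downtime_budget_minutes(0.999, 365 * 24)
per_month = downtime_budget_minutes(0.999, 30 * 24)
print(f"99.9% uptime allows about {per_year:.0f} min/year "
      f"({per_month:.1f} min/month) of downtime")
```

Roughly 8.8 hours per year, or about 43 minutes per month. A single hardware swap on a physical server can easily burn through that monthly budget, while a VMotion-style migration consumes none of it.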

A third reason for adopting virtualization is a small, but significant, "green" savings. Power, cooling, and rack space consumption are all reduced in highly virtualized data centers.

It's Not All Roses

On the other hand, virtualization is not a panacea. In fact, virtualization can even introduce new challenges and obstacles where they didn't exist before. For example, the majority of DBAs are, at best, grudging tuners and troubleshooters. Since virtualization further abstracts and complicates the troubleshooting process, a virtualized database infrastructure can be harder to keep running smoothly. When serious problems occur, they can take longer and be harder to detect, diagnose, and resolve. Troubleshooting may be further complicated, in large enterprises, by political separations of the database and virtualization teams.

In addition, many executives forget that while a VM is virtual, it is still a server. And each new server translates into more time needed to install, configure, provision, and maintain. So, while adding VMs may not add to the size and complexity of a data center from where the CIO views it (in the physical world), the DBA now has 2x, 3x, or 5x more servers to worry about.

Devaluing a DBA?

Right now, talented DBAs are priceless for data-centric operations. You simply can't run a serious database application for long without one or more on hand. The virtualization vendors, however, don't see eye to eye with that assessment over the long run. VMs aren't currently capable of instantly and elastically increasing their computing resources to match the database load running on them. But the virtualization vendors are hard at work to make that possible.

When you combine a hypervisor (capable of instant and elastic resource provisioning) with vastly increased hardware resources (those SSDs and massively multicore CPUs I discussed in the last couple of columns), you've got a recipe that dramatically reduces the need for highly tuned databases. So, while 2012 might not be the end of the world for DBAs, it's certainly the beginning of a new and uncertain period of change.

Kevin Kline is the technical strategy manager for SQL Server Solutions at Quest Software. A Microsoft SQL Server MVP, he is a founding board member of PASS.