Is Virtualization 2.0 Ready for Mission-Critical?


An entire industry has sprung up in response to the never-ending battle against complexity, server sprawl, and rising power consumption. Virtualization is now the mantra for beleaguered data center managers looking for ways to consolidate, better utilize, or abstract away their farms of physical servers and hardware. However, in many cases, virtualization itself can lead to even more complexity and offer uncertain value to the business. Many businesses are finding that virtualization is not ready for core mission-critical applications.

There’s actually nothing new about virtualization. “Most organizations have used virtualization technology for mainframe environments for over 30 years,” Dan Kusnetzky, president of Kusnetzky Associates and former IDC analyst, told DBTA. “Midrange system environments have been part of the party for 15 to 20 years. The newcomers are Windows, Linux and Unix. So, the typical data center is like a museum of computing. It contains technology from many suppliers, each doing something useful for the organization.”

The challenge is to get all this disparate technology from multiple suppliers to look and behave as if it were one. And virtualization is on the radar screens of a majority of enterprises. A recent survey from Evans Data found that 30 percent of the companies surveyed are currently using virtualization in one form or another, and a similar number plan to adopt it over the next two years.

“Virtualization is among the most disruptive data center technologies to emerge in years,” Alan Murphy, technical marketing manager for F5 Networks, told DBTA. “It is causing enterprises to rethink and rebuild their data centers from the ground up, and most enterprises are starting with OS virtualization.”

The new thrust of virtualization is to span the entire data center and enterprise, delivering an entire enterprise architecture through a single, consistent service layer. Some have even given a name to this new thrust - James Price, vice president of product and channel marketing for DataCore Software, calls it “Virtualization 2.0,” or total enterprise virtualization. Server and storage virtualization are at the core of this new paradigm.

The question, then, is where to start. What should be virtualized, and why? Kusnetzky said organizations have many reasons for going to virtualization, ranging from improving performance and scalability to making better use of the systems they already have. Each reason requires a different approach.

Not Your Average Project

However, Kusnetzky and other industry experts caution that virtualization also can increase the very complexity it is purported to solve. And many solutions or approaches, at least initially, are more expensive than standard solutions.

Thus, making a compelling business case for virtualization often is the first hurdle to enterprise efforts. All too often, IT managers undertake virtualization as a one-time project, when it needs to reflect a long-term business commitment, said Scott Feuless, senior consultant for Compass Management Consulting. “Virtualization requires a fundamental change in the way the environment is designed, changed, grown, measured and paid for,” he told DBTA. “The planning that goes into the initial project has to be documented, proceduralized and developed into a core competency. If this doesn't happen the ‘project’ will not achieve meaningful results.”

The initial costs for virtualization tools and solutions may be a showstopper for some organizations. Return on investment and cost of ownership are two metrics that may also dampen enthusiasm for a virtualization effort. “Organizations are finding that virtualization is not cheap,” Alex Bakman, CEO of Vkernel, told DBTA. “An investment in new, expensive servers is necessary. While the long-term cost savings will allow organizations to recoup the costs and save additional cost going forward, that initial investment needs to be justified to keep virtualization projects rolling.”

“Management has to see how their monthly IT charges will decrease as a result of virtualization,” said Feuless. “If that doesn’t happen, they aren't likely to support the initiative.”

“The trade-off for abstracting a physical resource and presenting a virtual resource in its place can inhibit IT organizations from effectively delivering an application service to the business,” Sean Derrington, director of storage management and CDP/R at Symantec, told DBTA. “This can be an operational as well as technical issue.”

Virtual Machines "Free"?

Virtualization, in essence, introduces a shift in IT costs. Both Bakman and Feuless urge the adoption of chargeback models to help fund and cost-justify virtualization initiatives. “The inter-organizational departments the IT data center serves must be aware of the costs,” Bakman said. “They are no longer being hit with the cost of a physical box when they want to deploy a new application server. This leads to the perception that a virtual machine is ‘free.’ IT needs a chargeback methodology to recover the cost associated with server virtualization.”
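
As a rough illustration of the chargeback arithmetic Bakman describes, the sketch below allocates a host's monthly cost to the departments running virtual machines on it, in proportion to the CPU and memory each VM reserves. All rates, weights and VM sizes are hypothetical.

    # Hypothetical chargeback sketch: split a virtualization host's monthly cost
    # across departments in proportion to the vCPU and memory each VM reserves.
    # All figures are illustrative, not vendor pricing.

    HOST_MONTHLY_COST = 2500.00        # hardware amortization + power + licenses (assumed)
    CPU_WEIGHT, MEM_WEIGHT = 0.5, 0.5  # assumed split of host cost between CPU and memory

    # Each VM: (department, vCPUs, memory in GB) - invented inventory
    vms = [
        ("finance", 4, 16),
        ("finance", 2, 8),
        ("marketing", 2, 4),
        ("engineering", 8, 32),
    ]

    total_cpu = sum(v[1] for v in vms)
    total_mem = sum(v[2] for v in vms)

    charges = {}
    for dept, vcpus, mem_gb in vms:
        share = CPU_WEIGHT * (vcpus / total_cpu) + MEM_WEIGHT * (mem_gb / total_mem)
        charges[dept] = charges.get(dept, 0.0) + share * HOST_MONTHLY_COST

    for dept, amount in sorted(charges.items()):
        print(f"{dept:12s} ${amount:8.2f} / month")

However the weights are chosen, the point is that each department sees a recurring charge for the virtual capacity it consumes, so a VM is never perceived as free.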

Feuless added that organizations still have not figured out how to quantify the results of virtualization initiatives. “Personnel savings in one area can be offset by increases in other areas,” he explained. “Hardware and software cost savings are typically not understood until the initiative is well underway. Traditional metrics like ‘hardware cost per server’ aren’t always relevant, because that number may actually increase with virtualization. It also becomes more difficult to relate utilization to costs.”

Virtualization requires a shift in the way server metrics are defined, Feuless said. “You need to start measuring processing delivered to the end user, rather than simply looking at hardware capacity. That's not easy to measure, and most organizations don’t have that capability yet.”
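
Feuless's point can be made concrete with a simple ratio: measure what the environment delivers to end users rather than what the hardware costs per box. The sketch below uses invented numbers to contrast “cost per server” with a delivery-oriented metric such as cost per thousand transactions served.

    # Illustrative only: contrast "hardware cost per server" with a
    # delivery-oriented metric such as cost per thousand transactions served.
    # All costs, server counts and transaction volumes are invented.

    def cost_metrics(label, monthly_cost, server_count, transactions_per_month):
        per_server = monthly_cost / server_count
        per_k_txn = monthly_cost / (transactions_per_month / 1000)
        print(f"{label:>12s}: ${per_server:9.2f}/server   ${per_k_txn:6.3f}/1k transactions")

    # Before consolidation: many small, underutilized boxes.
    cost_metrics("physical", monthly_cost=40_000, server_count=100,
                 transactions_per_month=50_000_000)

    # After consolidation: fewer, larger hosts. Cost *per server* rises even
    # though the cost per unit of delivered work falls.
    cost_metrics("virtualized", monthly_cost=30_000, server_count=20,
                 transactions_per_month=50_000_000)

The example shows why “hardware cost per server” can mislead: the virtualized environment looks worse by that metric even though it delivers the same workload for less total money.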

“To really change the cost structure of IT, and for IT to be more flexible to the needs of the business, organizations need to take a more holistic view by virtualizing all aspects of their data center,” Keith Millar, vice president of product management at Liquid Computing, told DBTA. He noted that this spans every layer of the data center, including storage, servers and networking. “Convergence of communications and compute is critical for this to be possible.”

Security Issues

Virtualization also hits limits in environments subject to strict security requirements or audits. “We found many customers who are doing virtualization in ‘non-production’ environments due to security issues,” said Jay Valentine, vice president of TDI, which works with a number of government agencies. “In the government, our classified customers are not allowed by their auditors to use virtualization for any classified activities.” Virtualized IT resources are also problematic in environments regulated by mandates such as Sarbanes-Oxley and HIPAA, he added.

Virtualization strategies are risky now from a data security standpoint, Valentine said. “This will change as security catches up. The problem right now is that all existing security was made for non-virtual systems and requires ‘agents.’ As agentless security systems become available, this will fix the problem.”

Performance Issues

Even when it is limited to non-critical applications, virtualization may also tax system performance, creating more issues within data centers. Increased utilization is realized at the expense of performance, Ram Appalaraju, vice president of marketing for Azul, told DBTA. “Today, a vast majority of virtualization projects are deployed for non-performance-critical applications,” he said. “Traditional virtualization solutions do not address performance needs as they usually slice CPU cycles and memory resources to host multiple copies of the operating systems and support multiple applications. While such approaches somewhat improve utilization of the systems, they are not appropriate for applications that need dedicated CPU and memory resources.”

F5’s Murphy agreed. “Virtualization moves many I/O tasks tuned for hardware back into software,” he said. “A virtualization translation layer is now responsible for translating that optimized code for the software CPU then back to the physical CPU running on the underlying hardware. I/O intensive applications, like databases, don’t fare well when virtualized because of this translation layer.”
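
One rough way to see the overhead Murphy describes is to time small synchronous disk writes on bare metal and again inside a guest VM, then compare the results. The sketch below is a minimal, illustrative probe, not a rigorous benchmark; the block size, write count and use of fsync are arbitrary choices.

    # Minimal, illustrative I/O latency probe: time small fsync'd writes.
    # Run the same script on bare metal and inside a guest VM to compare the
    # added latency of the virtualized I/O path. Not a rigorous benchmark.
    import os
    import tempfile
    import time

    WRITES = 200
    BLOCK = b"x" * 4096  # one 4 KiB block per write

    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        for _ in range(WRITES):
            os.write(fd, BLOCK)
            os.fsync(fd)      # force each write through to stable storage
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
        os.remove(path)

    print(f"{WRITES} fsync'd 4 KiB writes in {elapsed:.3f}s "
          f"({elapsed / WRITES * 1000:.2f} ms per write)")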

Performance management also is an issue that increases as more of the enterprise infrastructure is virtualized. As Kusnetzky put it: “Failing to plan is planning to fail here. Each type of technology must be used appropriately, or the results will be less than expected. I've seen many companies go into the realm of virtualizing some portion of their industry-standard system operations without a plan, and, in the end, they find that their issues with performance, reliability, and managing complexity only increase.”

Virtualization “compounds the performance management problem significantly,” Steve Henning, vice president of marketing for Integrien, told DBTA. “Now you have to deal with the physical server, the hypervisor, the guest VMs and most importantly the applications or application components running in the VMs. Virtualization makes it harder to see problems that could bring applications down and the new components that are byproducts of virtualization technology produce more performance data that will need to be analyzed by IT staffs that are already inundated with time consuming manual correlation processes.”
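
A small example of the manual correlation work Henning alludes to: given utilization samples from the host, hypervisor and guest layers, an operator (or an analytics tool) looks for which layer's metric tracks the application's response time. The data below is invented and the measure is a plain Pearson coefficient; real tooling would pull these series from monitoring APIs.

    # Illustrative only: correlate per-layer CPU utilization samples against
    # application response time to see which layer best explains a slowdown.
    # Sample values are invented. Requires Python 3.10+ for statistics.correlation.
    from statistics import correlation

    response_ms = [110, 115, 180, 240, 130, 120, 260, 300, 140, 125]
    layers = {
        "host_cpu":       [35, 36, 40, 42, 37, 36, 44, 45, 38, 36],
        "hypervisor_cpu": [20, 21, 55, 70, 25, 22, 75, 82, 27, 23],
        "guest_vm_cpu":   [50, 52, 58, 60, 53, 51, 62, 64, 54, 52],
    }

    for name, samples in layers.items():
        r = correlation(samples, response_ms)
        print(f"{name:16s} r = {r:+.2f}")
    # A strong positive r for one layer suggests where to look first.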

“We have seen some firms try to consolidate servers by simply installing a virtualization platform and adding servers to it,” said John Biglin, CEO of Interphase Systems. “Though that approach may work for some, it can dramatically increase the risk of an inappropriate - and underperforming - implementation.”

Biglin added that “with the proper planning and implementation, the challenges can mostly be mitigated and the benefits typically outweigh the challenges.” Henning also suggested that the rise of mixed virtual and non-virtual environments would require “sophisticated, real-time analytics to reduce the massive manual effort of managing this complexity and allowing a proactive approach to problem resolution.”

Many experts agree that virtualization strains the infrastructure in a number of ways. Additional storage requirements are another challenge that adds to the initial costs of virtualization. “Virtualization does require more storage, and the initial investment can be costly,” Vince Biddlecombe, CTO of Transplace, told DBTA. “Enterprises need to make sure their server infrastructure can handle the load in terms of storage and network capacity. A significant challenge is that there isn’t really a holistic reference architecture to rely on that encompasses storage requirements and best practices for server virtualization. Enterprises need to experiment to determine the best ratio of servers to applications for their environment. Done correctly, virtualization simplifies the architecture, but in the short run it can be a little more complex as you determine the design of your architecture.”
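
At bottom, the ratio Biddlecombe describes is capacity arithmetic. The sketch below estimates how many hosts and how much shared storage a given set of workloads would need; every workload figure, headroom factor and per-host capacity here is an assumption to be replaced with measured data.

    # Hypothetical capacity-planning arithmetic for a consolidation project.
    # All workload sizes, headroom factors and host capacities are assumptions.
    import math

    workloads = 120           # physical servers to be virtualized
    avg_vcpu_per_vm = 2
    avg_mem_gb_per_vm = 8
    avg_disk_gb_per_vm = 60   # OS image plus data
    headroom = 0.25           # keep 25% spare capacity for peaks and failover

    host_cores = 32
    host_mem_gb = 256

    vms_by_cpu = host_cores * (1 - headroom) / avg_vcpu_per_vm
    vms_by_mem = host_mem_gb * (1 - headroom) / avg_mem_gb_per_vm
    vms_per_host = math.floor(min(vms_by_cpu, vms_by_mem))   # tighter constraint wins

    hosts_needed = math.ceil(workloads / vms_per_host)
    storage_tb = workloads * avg_disk_gb_per_vm * (1 + headroom) / 1024

    print(f"{vms_per_host} VMs per host -> {hosts_needed} hosts, "
          f"~{storage_tb:.1f} TB of shared storage")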

Kusnetzky and others hold out hope that organizations will increasingly recognize the advantages of virtualizing over attempting to maintain sprawling data infrastructures. “Organizations have been seeing benefits in the mainframe and single-vendor midrange systems for decades,” he pointed out. “Now they're looking to replicate those benefits in the area of industry standard system-based application systems. If they've developed a good plan and then executed it well, they do find the cost-reduction, better utilization, better performance, etc., that they were seeking. Those who have gone on the journey without a map often end up somewhere other than where they wished to be.”

“Like any other disruptive project in the data center, virtualization should be well-planned before execution,” said Murphy. “The benefits should be weighed against the challenges, and the data center should be treated as a complete system. Rushing to implement one type of virtualization without first considering the ramifications across all parts of the data center will result in, at best, a less-than-stellar deployment and at worst, an expensive failed experiment.”

However, over the long run, virtualization takes data centers to a whole new level, many believe. Virtualization “is the next step in the evolution of technology, and has really allowed us to break the chains of hardware and enable the ability for services to be anywhere in the infrastructure they are needed - and not merely where they were initially deployed,” according to DataCore’s Price. “This is a new concept, as we’ve always thought of infrastructure in terms of hardware, and not in terms of services.”

