It's time to take a look at server virtualization


Virtualization in the data center is nothing new, of course; it's been around on the mainframe since the '70s.

But thanks mostly to the server consolidation craze, the concept is just now really making strides on Windows, Linux and other more recent denizens of the glasshouse.

Simply put, virtualization means that one computer can run multiple operating systems or applications "under the covers." A related idea is allowing an application to address more memory than is physically installed in the machine.

Thanks in large part to VMware Inc. (now owned by EMC) and products from Microsoft, virtual machines can now run on just about all the standard operating systems and platforms in the data center, from servers to storage.

With these strides, however, come new challenges in understanding and managing the technology. A background in mainframe definitions and implementations of virtualization does not guarantee success when the concept spreads to other platforms, observers say. Tools have changed, and training on specific platforms is required -- even for long-timers.

Conversely, people brought up on PCs and the various Unix flavors may know nothing about the concept and will likely need to be taught the basics.

Despite these issues, it is clear that virtualization is a force that will grow only stronger in the data center. Dan Kusnetzky, vice president of system software research at International Data Corp., said virtualization software as a category is expected to grow by around 19.5% per year, reaching around $14.5 billion in 2008.

Two overall trends account for this growth, he said. Organizations are "looking for ways to cut costs" and IT "must prove real, demonstrable business value," he added. Virtualization accomplishes both goals by allowing many physical machines to be consolidated into a handful that can handle at least as much work, if not more.
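As a back-of-the-envelope illustration of that consolidation math -- the server counts, utilization figures and headroom below are hypothetical, not figures from this article -- a rough sizing sketch in Python might look like this:

    import math

    # Hypothetical inputs -- illustrative only.
    physical_servers = 40        # existing, lightly used physical servers
    avg_utilization = 0.10       # average CPU utilization of each server
    target_utilization = 0.60    # ceiling allowed on each consolidated host
    peak_headroom = 1.25         # 25% buffer for usage spikes

    # Total work, in "fully busy server" units, including headroom.
    total_load = physical_servers * avg_utilization * peak_headroom

    # Virtualized hosts needed to stay under the utilization ceiling.
    hosts_needed = math.ceil(total_load / target_utilization)

    print(f"{physical_servers} lightly used servers -> roughly {hosts_needed} consolidated hosts")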

"What's new is the confluence of the individual trends and how they play off each other," Kusnetzky explained. In addition to understanding the nuances of each server platform's twist on virtualization, storage needs to be added to the mix. Furthermore, advances in hardware -- like blade computing or clusters in a box -- can mean even more complexity in how the concept plays out.

For these reasons, most observers advise having a well-crafted plan before implementation. "You need a strategy with respect to the data center or at least a piece of the data center," said Mike Kahn, managing director of The Clipper Group, a consultancy in Wellesley, Mass. "This isn't something you do casually. Sure, you can play with it in the lab, but you need to understand the potential of partitioning and the nature of the application you're dividing up."

There are, he said, architectural decisions to be made, such as whether to break an application into smaller jobs so it runs more efficiently across various processors, or whether to dedicate multiple processors to the entire application, depending on its size and peak performance requirements.

Another cautionary note is one of software costs. Many customers are exploring virtualization to help offset hardware costs by realizing the maximum potential from each machine. But users need to do some research into related software costs. Some vendors count each virtual machine as a separate processor, while others count only actual physical processors or the number of concurrent users. Just be sure your software costs won't rise more than the amount you're saving on hardware, Kahn said.
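To make that caution concrete, here is a rough comparison; the machine counts and license price below are invented for illustration, and real vendor terms vary:

    # Hypothetical licensing comparison -- all counts and prices are invented.
    physical_processors = 4       # CPUs in the consolidated host
    virtual_machines = 12         # VMs running on that host
    license_price = 3000          # assumed cost of one software license, in dollars

    per_vm_cost = virtual_machines * license_price      # vendor charges per VM
    per_cpu_cost = physical_processors * license_price  # vendor charges per physical CPU

    print(f"Licensed per virtual machine:    ${per_vm_cost:,}")
    print(f"Licensed per physical processor: ${per_cpu_cost:,}")
    # If the gap between the two exceeds the hardware savings from consolidation,
    # the project may cost more than it saves.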

There are other approaches to keep in mind, said Frank Gillett, principal analyst at Forrester Research Inc., in Cambridge, Mass. "What's interesting about server virtualization is that it lets you isolate one application from another, so they won't interfere with each other."

One upcoming approach to this is the container model used in Sun's soon-to-be-shipped Solaris 10. "It lets you isolate applications within the same copy of the operating system, and that works as long as all the applications are happy with the same patch level," he said.

Other options are workload management software and rapid provisioning, which let you run one application across a set of servers for a period of time and then hand the same servers over to another application later, Gillett said.
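As a toy sketch of that rapid-provisioning idea -- the workload names and time windows below are invented for illustration -- the same pool of servers can simply be assigned different work by hour of day:

    from datetime import datetime

    # Toy schedule: hour ranges mapped to the workload the shared servers
    # should run at that time. Names and windows are hypothetical.
    SCHEDULE = [
        (range(8, 18), "order-processing"),   # business hours
        (range(18, 24), "batch-reporting"),   # evening batch window
        (range(0, 8), "overnight-backups"),
    ]

    def workload_for(hour: int) -> str:
        """Return the workload the shared server pool should run at a given hour."""
        for hours, workload in SCHEDULE:
            if hour in hours:
                return workload
        return "idle"

    print(workload_for(datetime.now().hour))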

With all the benefits of virtualization, it's important to remember that it's only a first step toward data center automation, Gillett said. "Virtualization still leaves you with umpteen different servers to manage. You still have to worry about patching, upgrading and so forth." He said it's critical to pair virtualization with data center management software to help control the chaos and maximize the use of physical as well as virtual machines.

Overall, Kahn said, "People need to carefully think about this before they dismiss it as not being important or before they really dive into it. They need to have a plan."

About the author
Johanna Ambrosio is a freelance writer based in Massachusetts.

For more information:

Virtualization in the data center

Expert: Virtualization coming to Windows, Unix shops

Linux and french fries -- a real happy meal

This was first published in November 2004
