PHOENIX -- Virtualization is a decades-old concept with mainframe roots and an open systems future but, when it comes to investing in the technology, users continue to wait for a big green light.
"Virtualization is real; virtualization has been real," said John McArthur, a group vice president with International Data Corp., during his industry update presentation at this week's Storage Networking World 2003 conference. He referenced one user who left the storage market in 1983, when virtualization was a hot topic of discussion, only to return 20 years later and learn that it was still under scrutiny. "He figured it would be done by now," McArthur said.
Instead, many users are still waiting for more clear-cut definitions and proven solutions in the open systems market.
"Having a single point of administration -- that's huge," said Stephen Serbe, technology architect for E. & J. Gallo Winery in Modesto, Calif., explaining that he wants and expects to get such a benefit from virtualization.
But, while Serbe said he has seen virtualization for years -- virtual memory is the basis of all of his company's servers -- what he hears now seems more like vendor marketing hype. Serbe wants to know the specific value that virtualization will provide. For instance, he doesn't want to have to put more agents on every server just to know what's going on with his data; he's afraid of creating server downtime.
"We're in the business of producing wine. We're not a test lab, and more often we've felt like the latter," Serbe said.
Another user agreed that virtualization sounds like it will offer the technology he needs to better manage his environment. First, though, the storage community needs to understand what "virtualization" means and how to get the technology to work in an open systems environment.
"I'm very intrigued with this notion of virtualization of storage," said Newton Munson, director of information technology and networking services at the Virginia Institute of Marine Science, which is part of the College of William & Mary. He recalled virtualization being very stable on DEC platforms but, in those cases, he only had VMS to worry about. Today he works in a mixed Solaris and Windows environment, where he needs to be able to distribute potentially multiple terabytes of storage as needed for marine research projects.
"Virtualization has been a buzzword for about three years now, and you don't see it around much. But it is gaining momentum," Munson said. If the economy improves and vendors start throwing money at the technology, he said, virtualization efforts in open systems may ramp up, or so he hopes. He has already added the technology to his "covet list" for when he has the budget to invest.
"Six months from now, the number of people using virtualization will be staggering," said Tony Prigmore, a senior analyst with Milford, Mass.-based Enterprise Storage Group, during his storage industry primer.
"Virtualization lives across all three layers, from a storage management perspective. It's a critical abstraction area," he said. He meant that, by separating the presentation of storage to the server operating system from the actual physical devices, virtualization enables asset and performance management; data valuation and resource optimization; and data protection and retention.
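The abstraction Prigmore describes, decoupling the storage a server sees from the physical devices behind it, can be sketched in a few lines. This is an illustrative toy, not any vendor's implementation; all class and device names here are hypothetical:

```python
# Sketch of block-level storage virtualization: the host sees one
# contiguous logical volume, while a mapping table translates each
# logical block to a (physical device, offset) pair behind the scenes.

class PhysicalDisk:
    def __init__(self, name, num_blocks):
        self.name = name
        self.num_blocks = num_blocks

class LogicalVolume:
    """Concatenates extents from several physical disks into one volume."""
    def __init__(self):
        self.extents = []  # list of (disk, start_block, length)

    def add_extent(self, disk, start_block, length):
        self.extents.append((disk, start_block, length))

    @property
    def size(self):
        return sum(length for _, _, length in self.extents)

    def translate(self, logical_block):
        """Map a logical block number to (disk_name, physical_block)."""
        offset = logical_block
        for disk, start, length in self.extents:
            if offset < length:
                return (disk.name, start + offset)
            offset -= length
        raise ValueError("logical block out of range")

# The host OS sees a single 300-block volume; the mapping layer hides
# the fact that it actually spans two separate physical arrays.
vol = LogicalVolume()
vol.add_extent(PhysicalDisk("array_a", 1000), 0, 100)
vol.add_extent(PhysicalDisk("array_b", 2000), 500, 200)

print(vol.size)           # 300
print(vol.translate(50))  # ('array_a', 50)
print(vol.translate(150))  # ('array_b', 550)
```

Because the server only ever addresses the logical volume, an administrator can grow, migrate, or replicate the physical extents underneath without the host noticing, which is the basis for the management, optimization, and protection benefits Prigmore lists.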
One of the main reasons that virtualization hasn't taken off yet is because the virtualization niche has been dominated by small vendors, according to Prigmore. Many users are looking for offerings from the bigger vendors, which they often have more confidence in. Prigmore said that users can expect to see those big vendors pushing virtualization by year-end.
IBM Corp. is one such company. It is planning to offer its TotalStorage SAN Volume Controller for block-level virtualization and IBM Storage Tank for file-level aggregation.
Mike Zisman, IBM's vice president of corporate strategy and a speaker at the conference, said that virtualization improves business continuance and efficiency. However, he added, "The full benefits of virtualization can only be realized through the use of common, open interfaces."
Not enough users today are doing virtualization on the open systems side, IDC's McArthur said, explaining that open systems is where the technology makes the most sense, since that is where users spend most of their money and IT resources.
McArthur labeled virtualization "disruptive technology," using the term defined by Harvard Business School professor Clayton Christensen. That is, it's a technology that outpaces the ability of most users to absorb it. However, McArthur said, today's market is ripe for such disruption.
One user who is "disrupting" his data center is reaping the benefits. Rod Lucero, chief architect for Conseco Finance Servicing Corp. of St. Paul, Minn., shared his case study at the conference. By using DataCore Software's SANsymphony in a mixed Windows/Unix environment, he was able to allocate storage in one hour -- as compared with four days, which is how long it took previously. He was also able to enhance availability and increase utilization from 55% to 85%.
Such benefits illustrate the real value of virtualization and prove that, as McArthur said, "You need to disrupt yourself, because disruption is inevitable."