The year of storage virtualization? Try 2005. No, wait -- 2006. Storage virtualization, or the abstraction of the physical storage layer, has long been more hype than reality.
"It's always been a solution in search of a problem and was never really compelling to users," says Stephen Foskett, director of strategy services at GlassHouse Technologies, a storage research and consulting firm. "Survey after survey showed that virtualization was not top of mind for storage executives."
That could be changing, however. Virtualization technologies have matured over the past several years, and major vendors, including IBM, Hitachi Data Systems and EMC Corp., have trotted out new products and road maps. IBM says it has reached the 1,000-customer mark with its SAN Volume Controller (SVC), for example, while EMC's storage router, due out in the second quarter of 2005, has generated a fair amount of advance interest.
Although some nagging questions remain, many vendors and analysts think that mainstream acceptance is finally on the way. "We think there's more reality this time," says Richard Villars, vice president of storage systems at IDC, a research company in Framingham, Mass.
William Hurley, senior analyst at Enterprise Strategy Group in Milford, Mass., says he thinks that market adoption of virtualization is about halfway there. Widespread implementation of networked storage, he says, has driven new interest in managing and provisioning storage efficiently, although the market is at least two years away from true mainstream acceptance.
In the meantime, however, bellwether companies are starting to buy. The technologies are more mature, easing buyers' anxiety about betting on a relatively unproven technology. "The technology is now where it needs to be," says Brian Perlstein, technical architect at Oakwood Healthcare Systems in Dearborn, Mich., who is running SVC at his company's data center.
And there are finally some actual business drivers behind the new interest in virtualization. "Like many technologies, virtualization has had a clear technical meaning and an obscure business meaning," Villars says. The problems of managing burgeoning stocks of diverse storage had to become painful enough before companies were ready for virtualization, he says.
That pain threshold has been crossed at many companies over the past several years, and many have backed into virtualization as part of strategies aimed at cutting costs and increasing computing efficiency.
One such strategy is tiering storage, or putting high-value data on expensive, high-performance boxes and less important data on tiers of correspondingly cheaper storage.
"Virtualization could take tiered storage to the next level," Foskett says. "It could enable easy migration and data movement between tiers."
For example, Perlstein used his desire to implement tiered storage as part of his purchase rationale for SVC. "Tiered storage was part of my concept for purchase," says Perlstein, who currently has 30% of his 20 terabytes (TB) of storage behind SVC and eventually wants to virtualize everything. By using virtualization technology, Perlstein can move data between tiers and carve out virtual disks based on actual demand. He can then place the data in whichever tier corresponds to its business value.
Another popular business driver lies in utility computing initiatives in which companies attempt to build a computing infrastructure that responds to business needs on demand, much as an electric utility does. Part of this strategy involves implementation of virtualization strategies on both the server and storage side of the house as IT execs consolidate and partition data center servers and implement storage virtualization on the SAN.
First National Bank of Omaha in Omaha, Neb., is using storage virtualization as part of a utility computing initiative aimed at increasing operational efficiency by centralizing delivery of IT services. "We had 30% growth of distributed systems, and staff was growing at an alarming rate," says Michael O'Neil, vice president of enterprise systems at the bank.
To stem the tide, O'Neil has virtualized both the server and storage sides of the house, including a three-tier storage infrastructure with about 48 TB of storage. He says the move has given him the flexibility to move data easily between tiers and has made management much simpler. "We've seen huge benefits in staffing," he says. "We had 27 engineers when we started, and now we have six through redeployment and attrition."
However, not everybody is convinced that storage virtualization will become ubiquitous.
"One of the problems with virtualization is getting onto it and getting off of it," Foskett says. "Once you're on, it makes migration seamless, but getting to it is a project, and getting away from it is another project. If the vendor goes out of business or changes its offering, you are out of luck. It's a redirection layer and you can't take it out of the way."
Indeed, choosing virtualization technology is potentially very risky, as companies must trust that the technology will stay viable for the long term, and that the vendor will continue to support virtualization as a long-term strategy.
"Who you buy your network controller and supporting virtualization software from will denote your strategic storage partner," says Villars, who likens the purchase to such strategic software choices as going with SAP or PeopleSoft.
"It's absolutely core to the decision," says O'Neil. "We had a lot of concerned engineers … who thought we were putting a lot of faith in one technology. There was a perceived fear of that layer."
About the author: Carol Hildebrand is a frequent contributor to SearchStorage.com.