SearchStorage.com caught up with storage management expert Brett Cooper and asked him to give us his thoughts on...
a few issues regarding virtualization. Brett shares his expertise on how to effectively utilize virtualization technology, how it will make data management easier for managers and administrators, the costs involved and what he's hearing from the field about the technology.
Editor's note: This Q&A is part of a larger Featured Topic on "How virtualization can make your job easier."
SearchStorage.com: How are users most effectively utilizing virtualization technology today?
Brett Cooper: To understand the answer to this question, one must first understand the definition of virtualization, which at best is subjective. Today, virtualization can be found in several places: in storage devices, in a host's logical volume manager, and even in some of the SAN switches that have recently come to market (Brocade and Cisco, for example). Some companies are beginning to offer dedicated "virtualization" engines that work on top of general-purpose computing platforms, such as Microsoft Windows or Linux; in each case the simple function is to provision storage resources from a virtual pool of storage. The goal of virtualization tools in such a case is to create this pool and then dole out the storage based on an application's needs today and into the future.
Virtualization is an industry term that has very little true meaning. (The category in which I am an expert is storage management, which has a true meaning. How do I judge if something has a true meaning, you ask? Well, it is simple. Go to the CIO of any company and ask that person, "What does storage management mean to you?" The answer is simple: "Anything to do with storage in my shop.") The goal of virtualization is a lofty one -- to put all of the various storage players at the same level and let the storage administrator dole out the storage resources based on the needs of the application regardless of the storage coming from X vendor or Y location. These lofty goals may not be reachable in a large enterprise shop without rearchitecting the environment, which may or may not be possible given the enterprise's function. Most enterprise storage shops are project driven, and these projects go through a process in which they are budgeted, evaluated, rolled out, backed up, archived and grown to fit new needs. They may never have the chance to be rearchitected as they cannot go offline, nor may it be possible to do a wholesale replacement of the technologies.
I once visited the data center of a large company that had been in business for over 30 years and was running some 20-year-old equipment right alongside some of the latest and greatest technology. I had thought that you could find some of that old equipment only in technology museums or IT recycling dumpsters but there it was, powered up and running. I don't think that today's virtualization technologies are going to be a cure for all of the issues of the enterprise; rather they are going to provide some meaningful solutions for selective projects. Those should be the goals of new technologies, solving issues and reducing the complexity of the IT organization.
SearchStorage.com: How is virtualization making data management easier for storage managers and administrators?
Brett Cooper: Virtualization is making data management easier by delivering a consistent way to manage the storage requirements of today's enterprise applications, regardless of vendor preference or storage location. The virtualization tools, when properly deployed, can create pools of storage that deliver real-time access to storage as demands grow, all while utilizing the same back-end storage resources as before. The simplest example of this virtualization technology is a storage array that concatenates multiple disks together to make a large LUN that is exported to a host. The host sees the LUN as a single SCSI device and can reference it as a larger pool of storage than a single disk. Another example is a host-based volume manager that can string together disparate storage resources into a single dynamic disk. These host-based volume managers can grow the underlying volume online, without interruption to the host application. Several companies have announced virtualization solutions that are embedded in the fabric switches themselves, providing a pooling capability regardless of storage resource or host-based volume manager. When combined, these varied storage virtualization tools can deliver solutions to the issues of many enterprise customers.
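The pooling idea described above can be sketched as a toy model. All names here are hypothetical and this is not any vendor's implementation: disks of mixed sizes are concatenated into one pool, volumes are carved out of it, and a volume can be grown on demand while the pool tracks remaining capacity.

```python
# Toy model of storage pooling -- illustrative only, not a vendor API.
class StoragePool:
    def __init__(self, disk_sizes_gb):
        # Concatenate disparate disks into a single virtual capacity.
        self.capacity = sum(disk_sizes_gb)
        self.allocated = 0
        self.volumes = {}

    def provision(self, name, size_gb):
        # Dole out storage from the pool to an application volume.
        if self.allocated + size_gb > self.capacity:
            raise ValueError("pool exhausted")
        self.allocated += size_gb
        self.volumes[name] = size_gb

    def grow(self, name, extra_gb):
        # Grow an existing volume as application demand increases.
        if self.allocated + extra_gb > self.capacity:
            raise ValueError("pool exhausted")
        self.allocated += extra_gb
        self.volumes[name] += extra_gb

pool = StoragePool([36, 36, 73, 146])   # four mixed-size disks, 291 GB total
pool.provision("app_data", 100)
pool.grow("app_data", 50)
print(pool.volumes["app_data"])          # 150
print(pool.capacity - pool.allocated)    # 141 GB still free in the pool
```

The application sees only its volume; where the gigabytes physically live is the pool's concern, which is the essence of the abstraction.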
SearchStorage.com: How can virtualization facilitate backups?
Brett Cooper: Virtualization, or the idea behind it, is critical to backups, whether it is splitting a mirror using an array or a host-based logical volume manager, or using snapshots. The idea of virtualization provides the capability to build backup architectures whose primary storage-serving role is not impacted by the backup process. I have had many customers tell me that their secondary or tertiary storage environment, rather than their primary storage, is their biggest headache. Why, you ask? Because the backup process using tape devices is an incredibly complex and error-prone set of tasks that is in need of an overhaul. I am not saying that tape is destined to disappear like some dinosaur but rather that tape has a definite, important purpose: that of long-term, offsite, bet-your-business archive. Evolving from a dedicated tape-based backup solution to one that employs tiered disk-based backup and virtualization can not only deliver cost savings, but also ease complexity and increase job satisfaction within the data protection team. (Note that I didn't say backup and recovery. These people are critical to the function of the enterprise and they are not tape floppers!)
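The reason snapshots don't impact the primary storage-serving role is the copy-on-write mechanism: a snapshot preserves only the blocks that change after it is taken, so a backup can read a frozen point-in-time image while the live volume keeps accepting writes. A minimal sketch of the idea (hypothetical names, not any product's design):

```python
# Toy copy-on-write snapshot -- illustrative only.
class Volume:
    def __init__(self, blocks):
        self.blocks = dict(blocks)    # block number -> data
        self.snapshots = []

    def snapshot(self):
        snap = {}                     # holds pre-change block copies only
        self.snapshots.append(snap)
        return snap

    def write(self, block_no, data):
        # Before overwriting, preserve the old data in any snapshot
        # that has not yet saved a copy of this block.
        for snap in self.snapshots:
            if block_no not in snap:
                snap[block_no] = self.blocks.get(block_no)
        self.blocks[block_no] = data

    def read_snapshot(self, snap, block_no):
        # Snapshot reads fall through to the live volume for unchanged
        # blocks -- that is why snapshots are so space-efficient.
        return snap.get(block_no, self.blocks.get(block_no))

vol = Volume({0: "a", 1: "b"})
snap = vol.snapshot()
vol.write(0, "A")                      # primary keeps serving writes
print(vol.read_snapshot(snap, 0))      # "a" -- the frozen image
print(vol.blocks[0])                   # "A" -- the live data
```

The backup job reads from `snap`, the application writes to `vol`, and neither blocks the other.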
Virtualization can eliminate the need for redundant copies of data and splitting mirrors by migrating data online from a primary data site to a secondary data site, where the data is backed up using snapshots and then archived to tape on a scheduled basis. I have been working with one large telecommunication company that is evolving its primary data storage to reflect the company's needs for a more streamlined backup process. The company recently instituted snapshots as its backup and is using a 14-day rotating snapshot schedule with an archive to tape every seventh day. This allows the company to save 14 days' worth of data change on 2.4 times the amount of storage space, right-size its primary storage pool with mirroring (another older definition of virtualization) and reduce its dependency on error-prone tape in the process. The benefits of this new virtualized architecture are that the company gets a full backup each day and has cut its restore times, which are set at a 24-hour SLA, to under four hours, all while keeping its applications online through the whole process and avoiding the resyncing of any split mirrors.
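The 2.4x space figure above follows from simple arithmetic if we assume roughly a 10% daily rate of data change, an assumption on my part; the interview does not state the actual rate. The pool holds one live copy plus one day's worth of changed blocks for each of the 14 retained daily snapshots:

```python
# Back-of-envelope sizing for a rotating daily-snapshot schedule.
# The 10% daily change rate is an assumption chosen to show how a
# 2.4x figure can arise; the article does not state the actual rate.
def snapshot_space_multiple(days_retained, daily_change_rate):
    # Space = the live copy (1.0x) plus one day's changed blocks
    # preserved for each retained daily snapshot.
    return 1.0 + days_retained * daily_change_rate

multiple = snapshot_space_multiple(days_retained=14, daily_change_rate=0.10)
print(round(multiple, 2))   # 2.4 -- i.e., ~2.4x primary capacity
```

A lower change rate, or block-level rather than file-level tracking, would shrink that multiple further.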
SearchStorage.com: What costs are associated with implementing a virtualization technology? Is it easy to implement?
Brett Cooper: These are both very subjective questions. Costs can be quantified in dollars, and one can measure the time a given solution takes to deploy in hours, days, or weeks, but as we recently learned when the bubble burst for the e-world, one cannot easily quantify the goodwill or the sweat that a given solution will take to deploy. So, the answer is simple: it varies greatly. Most enterprise shops may be surprised to find out that they have already deployed various levels of virtualization -- they just don't call it virtualization. As I have stated before, I believe the "V" word, virtualization, is very overused.
SearchStorage.com: What are the reviews from IT professionals who are using some form of virtualization?
Brett Cooper: The reviews that I have received are far ranging, from "we love it" to "not ready for prime time." Once again, we have very subjective reviews depending on the IT professional's goals in rolling out a virtualization platform.
I have found that IT professionals who do their homework up front, understand their enterprise's need for virtualization technology and then evaluate their choices based on the outputs of that analysis are usually much happier than those who have purchased a given solution based on a vendor's recommendation to try X solution out. Most IT professionals have not rolled out a full enterprise storage management solution -- rather, they have elected to test various solutions within the organization to see what works and what does not. Several have selected OS-specific virtualization solutions, such as VERITAS Volume Manager (VxVM) on Solaris, since it provides so many unique abilities on the Solaris platform, but many have selectively avoided other technologies that require dedicated hardware or out-of-band management products. There is no single answer to this question. Ask several IT professionals for their opinion and you are likely to get a different answer from each one.
About our author: Brett Cooper is a Technical Marketing Engineer at Network Appliance, Inc. and is a frequent speaker at storage industry events. During his recent tenure at Veritas Software, he was responsible for developing and delivering the first release of Veritas SANPoint Control, one of the industry's first storage management solutions. Brett was also one of the founders of the Veritas Press, where he acted as technical advisor for the well-known storage reference book, "Storage Area Network Essentials: A Complete Guide to Understanding and Implementing SANs," by Paul Massiglia and Richard Barker. In Brett's current role at Network Appliance, he is responsible for delivering Fibre Channel Protocol (FCP) and Internet SCSI (iSCSI) solutions to the market.