50 per cent off storage

Storage Insider

Storage vendors of every stripe have been feverishly working to hitch their wagons to the server virtualization juggernaut. Going far beyond the basic integration and certification activities one would expect in support of a popular application, vendors are building management functionality into their storage products and looking for other ways to differentiate their VMware support.

One of the boldest announcements came from NetApp this past week, purporting to guarantee that users will consume 50 percent less storage in their virtual environments if they deploy NetApp gear rather than a competitor's. Now, before getting too excited, it is necessary to point out that this guarantee comes with a list of conditions reminiscent of a TV prescription drug ad, and I suspect that competitors will have a field day with some of them.

Nonetheless, there are two important takeaways. NetApp is seriously stepping up its endorsement of thin provisioning as a key technology for virtualized environments, and, even more noteworthy, it is supporting deduplication as a primary storage technology. Until now, deduplication has been targeted almost exclusively at secondary data, particularly backup, but the commonality of data across guest virtual machines (think C: drives) is such that there is potential for substantial savings.
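To see why shared guest-OS content translates into savings, consider a toy sketch of block-level deduplication. This is purely illustrative, not NetApp's implementation: the block size, the use of SHA-256 fingerprints, and the synthetic "VM images" are all assumptions made for the example.

```python
import hashlib

BLOCK_SIZE = 4096  # dedupe granularity in bytes; real systems vary


class DedupStore:
    """Toy content-addressed block store: identical blocks are stored once."""

    def __init__(self):
        self.blocks = {}        # SHA-256 digest -> block bytes (physical storage)
        self.logical_bytes = 0  # total bytes written by clients

    def write(self, data: bytes) -> list:
        """Store data block by block; return the list of block fingerprints."""
        refs = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            # A new physical block is allocated only if this content is unseen.
            self.blocks.setdefault(digest, block)
            refs.append(digest)
        self.logical_bytes += len(data)
        return refs

    def physical_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())


# Two hypothetical guest VM images sharing a common OS portion
# (the "C: drive" effect described above).
common_os = b"windows-system-files" * 1000       # identical across guests
vm1 = common_os + b"app-data-for-vm-one" * 50    # guest-specific data
vm2 = common_os + b"different-data-vm-two" * 50

store = DedupStore()
store.write(vm1)
store.write(vm2)

ratio = store.logical_bytes / store.physical_bytes()
```

Because the blocks holding the shared OS content hash to the same fingerprints, they are stored once and referenced twice, so physical consumption comes in well under the logical bytes written. The more guests share a common image, the better the ratio.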

However, we are very much in the early stages of the storage battleground inside the virtualization arena. VMware's recent announcement of its Virtual Datacenter Operating System (VDC-OS), and specifically its vStorage and related components, promises new fields of fertile ground for storage vendors to plow. Far more significant than mere integration with VirtualCenter, this will open the floodgates to a new range of VMware-friendly capabilities. Specifically, the vStorage APIs will enable tighter integration of storage-related functions, including load balancing and other I/O performance enhancements, thin provisioning, snapshots, and replication, as well as more comprehensive capacity and performance management.

Ultimately, for those tasked with architecting virtualized infrastructures this is good news, but it means that current storage design assumptions and common practices that have evolved over the past few years will need to be reevaluated. It's a good time to step up long-range planning and ensure a solid understanding of vendor roadmaps in this area.

Jim Damoulakis is chief technology officer of GlassHouse Technologies, a leading provider of independent storage services. He can be reached at
