Some vendors, like NetApp, claim storage virtualisation can double utilisation rates. The company is so confident of the technology’s value that it is offering a guarantee: customers will use 50 per cent less storage in their virtual environments than they would have with traditional storage. And if they don’t, the vendor will supply whatever further hardware is required to hit the promised figure.
“We’re that confident about it. It is that smart,” director of partner sales, Scott Morris, said. “With this kind of guarantee, resellers can boldly go to the market with a real claim to creating value for their customers.”
Thin edge of the wedge
Another powerful outcome of virtualised storage is the potential for its use, in combination with virtualisation at server level, to simplify DR. But the virtualisation technology with the biggest ramifications in the storage world is thin provisioning.
Traditionally, when provisioning a server, an IT manager would have to assign it a given quantity of storage. They would estimate how much would be required in the years to come, and would generally over-provision to save repeating the task once the application was live.
Thin provisioning, by contrast, is a technology that allows for the “just-in-time provisioning of storage”, removing the need to have banks of spindles powered up but sitting idle.
The virtualisation technology, HDS’ Elisha said, takes the total pool of storage and divides the set of disks provisioned for an organisation’s servers into small (42MB) chunks. Servers draw down these chunks in increments only as they need them. This saves the cost of buying disks that would otherwise sit spinning, consuming power and generating heat while waiting to be used.
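The allocate-on-demand behaviour Elisha describes can be sketched in a few lines of Python. The 42MB chunk size comes from the article; the class names, pool size and write pattern below are illustrative assumptions, not any vendor’s actual implementation:

```python
CHUNK_MB = 42  # allocation unit cited by HDS

class ThinPool:
    """A shared pool that hands out fixed-size chunks only on first write."""
    def __init__(self, physical_mb):
        self.free_chunks = physical_mb // CHUNK_MB
        self.allocated = {}  # (volume_name, logical_chunk_index) -> True

    def create_volume(self, name, logical_mb):
        # Note: no physical space is reserved at creation time.
        return Volume(self, name, logical_mb)

class Volume:
    """A logical volume whose advertised size exceeds its physical footprint."""
    def __init__(self, pool, name, logical_mb):
        self.pool, self.name, self.logical_mb = pool, name, logical_mb

    def write(self, offset_mb):
        """Back the chunk containing offset_mb with real storage, if not already."""
        key = (self.name, offset_mb // CHUNK_MB)
        if key not in self.pool.allocated:
            if self.pool.free_chunks == 0:
                raise RuntimeError("pool exhausted: add physical disk")
            self.pool.free_chunks -= 1
            self.pool.allocated[key] = True

    def used_mb(self):
        """Physical capacity actually consumed by this volume."""
        return sum(CHUNK_MB for (v, _) in self.pool.allocated if v == self.name)
```

A volume built this way can advertise terabytes of logical capacity while the pool only backs the chunks that have actually been written, which is where the purchasing and power savings come from.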
HDS research suggests most organisations only use 35 per cent of their storage capacity. Using thin provisioning, a server physically consumes only that 35 per cent but assumes it has 70 per cent more at its disposal.
“For a medium to large organisation with 20TB of capacity, that might equate to [short-term] savings of $500,000 to $600,000 capital expenditure over a four-year period,” HDS senior marketing manager, Tim Smith, said.
“The cost of storage is going down by 30 per cent a year, so every 12 months you delay buying disk is a saving of 30 per cent.”
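Smith’s deferral arithmetic is easy to check. A minimal sketch, assuming a constant 30 per cent annual price decline for the same capacity (the function name and sample figures are illustrative, not HDS pricing):

```python
def deferred_cost(price_today, annual_decline=0.30, years_deferred=1):
    """Cost of buying the same capacity later, assuming a constant
    percentage price decline per year (Smith's 30 per cent figure)."""
    return price_today * (1 - annual_decline) ** years_deferred
```

Deferring a $100,000 purchase by one year would cost roughly $70,000 at purchase time under that assumption; the actual saving depends on the decline rate holding, and on the deferred capacity not being needed in the meantime.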
However, thin provisioning is relatively new and not everyone is sold on its immediate value.
“We have evaluated it, but customers don’t at present have a demand for it,” Fujitsu’s Bell said. “It’s a relatively new feature set. It will become more useful to companies when carbon trading is on the agenda.”
Gartner’s Sargeant claimed virtualisation was “still expensive to implement” at storage level.
“A lot of organisations can’t quite justify putting it in,” he said. “It’s not as widespread in storage systems as people make out. It only will be when standards emerge and prices go down.”
While bullish about the power of virtualisation, even NetApp’s Morris qualified his enthusiasm for the technology.
“Virtualisation is not all things to all people,” he said. “There is almost as much benefit from pure consolidation.”
But whatever technology resellers judge appropriate for a client dealing with the data flood, most organisations are still at an early stage of understanding what to do with their storage and how long to keep it, HDS’ Elisha concluded.
“They want solutions to problems, not for you to meet your sales goals,” he said. “As a reseller you should be saying, ‘here’s something you might want to do, but don’t have to do’.”