Attendees at our roundtable were convinced the cloud computing concept wouldn’t truly gain widespread adoption until standards were developed. But just what kinds of standards are required? And is that feasible, or just a utopian dream?
Ethan Group’s Shadi Haddad suggested we simply needed an “acceptable” standard.
“Take those wars between Beta and VHS, or HD-DVD and Blu-ray: If you look at what won, it was the inferior technology, but an acceptable standard. It was good enough and affordable enough for everybody,” he said. “Virtualisation in the server market is a perfect example of that – industry-standard servers are what we class as x86. It was never good enough to run mission-critical applications, because banks didn’t think it was an acceptable standard. But you start to get better resource utilisation, redundancy and fault tolerance through virtualisation, and all of a sudden it’s an acceptable standard. It’s as available as AIX or Unix boxes.”
Dimension Data’s Ronnie Altit agreed that IT was full of technology standards that were merely good enough.
“That’s IT in general – once a product does 80 per cent of the functionality and it’s cheaper, people think that’s good enough. You see it with Active Directory/eDirectory, SQL/Oracle, Exchange/Notes,” he said.
Ultimately, it would be applications that decided the standards battle, VMware’s Andre Kemp said.
“How many applications are there for the Apple iPhone? If you build it, they will come. But the standard is the iPhone,” he said. “With standards comes proliferation. The battle of Beta and VHS is a classic one. It was lower cost and proliferation that made VHS the standard. The smart money is to see how this shakes out for a while and how things firm up. I firmly believe the applications stack will drive cloud computing directions, whatever that may look like.”