Cloud computing: The very definition of cloud computing remains controversial. Consulting firm Accenture has crafted a useful, concise definition: the dynamic provisioning of IT capabilities (hardware, software, or services) from third parties over a network.
Cloud computing is a computing model, not a technology. In this model, all the servers, networks, applications and other elements of the data center are made available to IT and end users via the Internet, in a way that allows IT to buy only the type and amount of computing services it needs. The cloud model differs from traditional outsourcing in that customers don't hand over their own IT resources to be managed. Instead, they plug into the "cloud" for infrastructure services, platform (operating system) services, or software services (such as SaaS apps), treating the "cloud" much as they would an internal data center or computer providing the same functions.
What is cloud computing?
Despite snarky comments among industry insiders and imprecise metaphors meant to explain virtualization and cloud computing to the masses, it is patently untrue that the corporate computing world is returning to the mainframe model of computing.
For one thing, today's hardware, software and networks are cheaper, more flexible and more accommodating of whatever a user or data-center manager wants to do. Rather than making users wait days or weeks for changes or reports, a typical data center can easily add extra storage or computing power to accommodate, say, an online sales promotion.
At the same time, tight budgets, a bad economy, and computing hardware whose capacity has largely outstripped the demands business applications put on it have increased the pressure on CIOs not only to show they're using IT dollars efficiently, but actually to do so.
Virtualization - as well as the cloud computing model within which it often runs - answers much of that need by giving CIOs the ability to cover a week-long spike in demand by turning up the spigot on the computing power a business unit gets. A layer of virtualization software allows a bank of servers to share the available workload, and lets the CIO give a business unit 10% more storage capacity or compute power, rather than having to go buy completely new servers that add 10 times the required capacity.
The mainframe-like miracle is abstraction - the ability to hide the complexities of a system from the end user while providing all the power and capabilities the user requires.
The World Wide Web is the largest abstraction layer in IT - hiding the complexity of a global network with hundreds of thousands of specialized servers and arcane data behind search engines and hotlinks.
In IT, "virtualization" most often means server virtualization - in which one physical server acts as host to several virtual servers, each of which runs on a layer of software called a hypervisor whose job it is to parcel out storage, memory and other computing resources while making each virtual server believe it is running by itself on a standalone computer.
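The resource-parceling idea can be sketched in a few lines of code. This is a toy model only - the class and method names are invented for illustration, and real hypervisors such as KVM, ESXi or Hyper-V are vastly more sophisticated - but it shows the core bookkeeping: one physical host carves out slices of its CPU and memory for several virtual servers, and each guest sees only its own slice.

```python
# Toy model of server virtualization. All names (PhysicalHost, create_vm)
# are illustrative assumptions, not any real hypervisor's API.

class PhysicalHost:
    def __init__(self, cpus, memory_gb):
        self.cpus = cpus          # total physical CPUs on the host
        self.memory_gb = memory_gb  # total physical memory on the host
        self.vms = []             # virtual servers carved from this host

    def create_vm(self, name, cpus, memory_gb):
        """Carve a slice of the host's resources out for a new VM."""
        used_cpus = sum(vm["cpus"] for vm in self.vms)
        used_mem = sum(vm["memory_gb"] for vm in self.vms)
        if used_cpus + cpus > self.cpus or used_mem + memory_gb > self.memory_gb:
            raise RuntimeError("host is out of capacity")
        vm = {"name": name, "cpus": cpus, "memory_gb": memory_gb}
        self.vms.append(vm)
        return vm

host = PhysicalHost(cpus=16, memory_gb=64)
web = host.create_vm("web-server", cpus=4, memory_gb=8)
db = host.create_vm("db-server", cpus=8, memory_gb=32)

# Each guest "believes" it has its own standalone computer of this size:
print(web["cpus"], web["memory_gb"])   # prints: 4 8
```

The point of the sketch is the illusion: the web server's view of the world is its 4-CPU, 8GB slice, while the hypervisor quietly tracks how much of the real machine remains.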
Cloud computing takes that abstraction one step further. Rather than making one server appear to be several, it makes an entire data center's worth of servers, networking devices, systems management, security, storage and other infrastructure look like a single computer, or even a single screen.
The idea is to let companies buy exactly the amount of storage, computing power, security and other IT functions that they need from specialists in data-center computing - in the same way they used to pay AT&T to come install the number of phones they required.
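That buy-exactly-what-you-need model can be illustrated with another toy sketch - again, the names here (CloudPool, provision, grow) are invented for illustration, not any real provider's API. A customer draws exactly the storage it needs from a shared pool, and a 10% bump, like the one described above, is a bookkeeping change rather than a hardware purchase.

```python
# Toy sketch of cloud-style provisioning: customers buy exact amounts of
# storage from a shared pool. Names are illustrative assumptions only.

class CloudPool:
    def __init__(self, total_storage_gb):
        self.total = total_storage_gb  # the provider's whole data center
        self.allocations = {}          # what each customer has bought

    def available(self):
        return self.total - sum(self.allocations.values())

    def provision(self, customer, storage_gb):
        """Sell a customer exactly the storage it asks for."""
        if self.available() < storage_gb:
            raise RuntimeError("pool exhausted")
        self.allocations[customer] = storage_gb

    def grow(self, customer, pct):
        """Grow an allocation by pct percent, e.g. pct=10 for a 10% bump."""
        extra = self.allocations[customer] * pct // 100
        if self.available() < extra:
            raise RuntimeError("pool exhausted")
        self.allocations[customer] += extra

pool = CloudPool(total_storage_gb=10_000)
pool.provision("business-unit-a", 500)
pool.grow("business-unit-a", 10)   # a 10% bump, no new servers bought
print(pool.allocations["business-unit-a"])  # prints: 550
```

The contrast with the old model is the granularity: the business unit ends up with 550GB, not a whole new rack sized for ten times the need.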