The end of corporate IT

You called your article The End of Corporate Computing. Why?

Nicholas Carr (NC): Up till now, it's been assumed that companies have to own the basic assets involved in computing. I think we are moving to a time when that assumption will be overturned and those assets will begin moving from within companies to more centralised utility suppliers. It's a shift similar to what we saw 100 years ago, when all manufacturers maintained their own electric generators to power machinery. Over 20 or 30 years, they shut down those generators and began to buy electricity from utilities. Just as today we wouldn't talk in terms of corporate electricity generation, I think tomorrow we won't talk in terms of corporate computing.

There has been lots of discussion over the past few years about utility computing. What's different about your take on it?

NC: I try to look at the economics of business computing as opposed to the technology of utility computing itself. I argue that, up till now, a lot of the utility computing discussion has looked at isolated instances of hosted applications, like Salesforce.com or one company hosting another's websites. That makes it easy to believe this is a fragmented phenomenon in which a bunch of companies provide a limited number of outsourced services.

I believe it's a much bigger wave of change, because today's entire model of business computing is built around fragmentation of basic assets - everyone having to buy what, in many cases, is similar equipment and software. All that stuff will ultimately be centralised outside companies, and that will lead to much greater efficiency, which will translate into lower costs and greater reliability for users.

Assuming you're right, this is more of a gradual evolution than a sky-is-falling event, right?

NC: Absolutely. We're not going to wake up tomorrow and get all our computing requirements through a socket in the wall. It will take a couple of decades for this to roll out. It's a matter of utility suppliers slowly building up enough scale and enough expertise that they can replace ever larger internal data centres.

It tends to start with smaller companies that find it difficult to buy and maintain their own systems. Those are the first ones to move to a utility model. As the utility model gains greater efficiency, it will get scale advantages over larger corporate IT functions.

The utility model brings dependence on a single vendor, which reasonably worries IT folks. How would you keep the utility honest?

NC: That's a good question, because beyond the interests of individual users, there's a danger of too much of this very important infrastructure falling into the hands of too few companies.

It's critical that there continues to be competition, both at the level of the utilities and among the component suppliers to the utilities. Don't think hardware and software companies will go away; they'll just shift from supplying the user to supplying the utility company. So it's critical at the highest level to ensure strong competition between all those parties. Eventually, as with electricity, it may require the government stepping in to ensure that there isn't too much consolidation.

At the individual company level, there are certain risks involved in consolidating your assets with one supplier, but also considerable gains. Ultimately, the advantages of shedding responsibility for expensive, finicky assets will come to outweigh the fear of letting somebody else run them.

Looking at the electricity analogy, electricity doesn't involve the kind of security risks inherent in data transfer. How does security fit into this picture?

NC: I think that ultimately centralising control over a lot of the basic IT infrastructure will actually increase the level of security over the current highly fragmented and distributed model. Where IT is more distributed, it's more vulnerable in many ways. One of the advantages of a utility model is that the entire success and fate of the utility hinges on its ability to maintain security.

Having said that, there are certainly different security issues when you have consolidation of data, and at a technology and policy level, it's going to take some innovations and advances to get to the level of security necessary for really large-scale utilities to emerge. But over time, economics will drive those and it will happen.

You say an outside supplier will take responsibility for all of a company's IT requirements - from infrastructure and storage to applications. Isn't that like expecting the power company to also supply your light bulbs, TV and vacuum cleaner?

NC: Not really. A key difference [between electricity and IT] is the number of layers of applications, and I don't mean just application software. With electricity, you had generation, and then the uses, which had to take place locally - like the vacuum cleaner.

With IT, there's the basic infrastructure, then a layer of application software that can increasingly be run remotely. Then there's how the outputs of that application software are used by companies - that's the vacuum-cleaner layer, the one that will stay local.

Companies will still have to figure out how to best use the information in software applications and how to adapt processes and do all the stuff that you need to do today. The difference is that someone else can worry about all the underpinnings.

In your vision, does anything recognisable as IT still exist?

NC: Under this model, what we now call an IT department is unlikely to continue to exist in its present form. But I think you'll still need people who combine deep technical knowledge with strong business and process knowledge, because there will still be a need for someone who can translate everything you're buying from outside providers and interface it with your own processes. If you assume that IT departments have recently begun to shift to more of a process and business focus, in some ways this will be a continuation of that shift.

