I've never considered myself a technology Luddite, but now I'm not so sure. One of the defining points of this past decade was the so-called paradigm shift to client/server computing. The term started to appear around 1990, not long after open systems had hit the big time. What we had in those days looked more self-serving than client/serving. Every supplier was suddenly doing client/server, often with the same products they'd been selling the year before.
So, mired though we were in recession, it felt like a period of real optimism had begun. Here at last was the long-awaited way forward which made sense of it all. It gave the stamp of corporate respectability to the troublesome PC. Just as importantly, it offered a glimmer of a future to the shaky old-guard which had been struggling to find a new identity. Their business was now about servers and services and they could co-exist with the PC rather than compete with it.
Whether it was vendors pushing or users pulling, the IT world seemed enamoured of client/server. Hesitantly at first, users started to plan for it in their futures. And the trickle, as they say, became a flood. And that's where my problem started.
I had two pictures in my head, computing before client/server and computing after, but I found myself thinking that the "after" picture didn't look much better than the "before". I believe now that we've been deceived, and the spoiler in this idyllic picture has been the true cost of PC ownership.
It's an insidious cost that's often remained buried, but it's increasingly coming to the fore. And it goes beyond the hardware and software to the enormous labour costs, both overt and hidden. It has meant we've shifted costs, even added them, rather than replaced them, and it's robbed users of any real gains that might have been delivered by the microprocessor revolution.
A few months back, I read a report in which an MIS manager lamented the fact he'd been forced to throw out hundreds of cheap, simple terminals and replace them with PCs. The reason was that development of his old software package had been phased out in favour of a slick client/server model.
Most of his users, on the factory floor, in the stores, at the customer services desk, in the accounts department, didn't need PCs and were positively hindered by having to learn them. The new system was no doubt a great benefit to a handful of people in terms of flexible reporting and set-up. More importantly, it looked great in the demo, so it was clear which system the software developer would keep when the crunch came and one had to go.
In this drive to "flexibility, power and user-friendliness" that the new paradigm claims to offer, we've lost something along the way.
The transition to the current crop of office applications is similar. The real cost is huge when you factor in hardware, training and support (including the informal support networks of "office experts" that inevitably pop up). But the productivity gains aren't so obvious.
While the eventual move is inevitable, I'm sure many users are upgrading because they can see the end of their "old" systems coming, rather than because of genuine, cost-justifiable productivity gains from the new generation. Typically, too, where new features are useful to only a handful of users, the gains are lost in the cost of upgrading all users so those few can take advantage of them.
As a PC user (indeed, enthusiast), I'm getting less and less incremental benefit from each upgrade. As a CEO, I'm seeing a diminishing justification and am much more aware of the real costs of those upgrades. I'm becoming a technology Luddite.
I'd get a bigger productivity boost from a slightly smarter e-mail package than I would from more features in my top-of-the-line word processor or spreadsheet. Most of my staff would too.
So when the cost starts to outweigh the benefits, when expensive upgrades are forced by the technology itself rather than by genuine productivity gains, and when we find ourselves getting pushed into marginal applications for which PCs aren't quite suited, it's time to sit back and take stock.
I'm not predicting the death of the PC. That would be like predicting the death of the mainframe (and yes, we've run a few of those predictions over the last 10 years). It hasn't happened. It won't happen. The PC has a long and successful life ahead of it. But many new applications are about presentation and communication, not processing. They need usability and a cost of ownership closer to that of a TV and telephone than a computer.
A PC on every desk and a PC in every home won't happen. And that's good. Let PCs do only what they do well. We've reached a point where users - whether in the office or the home - are making too many compromises, living with too many shortcomings, seeing too little return in exchange for the high and real costs.
I now understand how Bill Joy felt when he despaired at the direction the computing world was taking. He went bush for a couple of years to help change it, and Java, the product his company Sun eventually came up with, is on the verge of shaking the very foundations of computing.
It promises choice, variety, easy communication, the chance to build the right tool for each job. And as computers drive into every corner of our work and private lives, those qualities are essential.
I've seen the past, and it's still a half-met promise. The PC has got us off to a start, but the promise of client/server, with its plethora of intelligent communicating terminals, won't be delivered by a one-size-fits-all window into this world.
We're in for another rocky ride, but maybe this one will deliver businesses and wired homes alike the other half of the client/server promise: the real benefits, not just the costs.