"A designer knows he has achieved
perfection when there is nothing
left to take away."
- Antoine de Saint-Exupéry, by way of A Word A Day

I spent five long years studying the behaviour of electric fish at uni before becoming a professional programmer in the world of commerce. Not a day of my fish research was wasted - I've reused nearly everything I learned at uni in my business career.
Object technologists are not the only people who care about reusability. To prove it, we're going to apply the philosophy of science to your day-to-day business decision-making.
My colleagues and I had many a fine discussion about which theories had scientific value and which ones provided bull-fodder when mixed with a few mugs of brew. The short version: only theories that have both explanatory and predictive power are scientifically useful, because theories that explain but don't predict can't be tested.
Businesses deal with theories all the time. To their misfortune, businesses have only one way to test a theory - try it and see what happens. Sadly, the results still don't tell us much; businesses want to make money, not test theories, so they don't apply the kinds of experimental and statistical controls that lead to confidence in the results.
One perfectly valid business theory may be associated with marketplace failure (perhaps due to poor execution) while another - a really stupid idea - ends up looking brilliant because the company that followed it did enough other things right to thrive.
Although business theories are rarely as certain as, say, the laws of thermodynamics, they're often good enough to be worth using - provided they're useful and not just interesting. And useful, for a business theory, means it provides guidance when you're making decisions.
Client/server refers to a software partitioning model that separates applications into independent communicating modules. The test of client/server isn't where the modules execute - it's their separation and independence.
Distributed processing is a hardware and network architecture in which multiple, physically independent computers cooperate to accomplish a processing task.
You certainly can implement client/server computing on a distributed architecture (they go together naturally) but they are not the same thing.
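The point that client/server is a partitioning model, not a deployment model, can be made concrete. Here's a minimal sketch (the port number and the toy pricing rule are my own invented examples, not anything from a real product): two independent modules communicating over a socket, both running on one machine. It's client/server with no distributed hardware in sight.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9099  # hypothetical port; same box for both modules

def serve_once(srv: socket.socket) -> None:
    # Server module: owns the business rule (a made-up unit price of 19).
    conn, _ = srv.accept()
    with conn:
        qty = int(conn.recv(64).decode())
        conn.sendall(str(qty * 19).encode())

def client(qty: int) -> int:
    # Client module: knows only the wire protocol, not the pricing rule.
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(str(qty).encode())
        return int(sock.recv(64).decode())

srv = socket.create_server((HOST, PORT))  # bound before the client connects
threading.Thread(target=serve_once, args=(srv,), daemon=True).start()
total = client(3)
print(total)  # 57 - the modules are separate even though the machine isn't
srv.close()
```

Move the server module to another box and nothing about the partitioning changes - which is exactly why the two concepts are distinct.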
I can almost hear some readers saying, "Oh, come on, that's just semantics", but the distinction matters. In other words, we're dealing with a useful business theory.
One of our teams used it recently while helping a client sort through some product claims. One vendor touted its "paper-thin client" - it uses X Windows on the desktop - as one of its desirable design features.
A thin client design was just what we were looking for, because we wanted to reuse a lot of the system's business and integration logic in new front-end applications.
Looking at the product more closely, we discovered something wondrous. The vendor hadn't implemented a thin client at all. It had built a fat client that mixed presentation and business processing, but had executed it on the server. The system used paper-thin desktops, not paper-thin clients.
Thin desktops may be just what you're looking for. They reduce the cost of system management and can give you highly portable applications. They do have downsides, though - the presentation logic has to wait its turn for the server's attention, and executing it there puts a much higher processor load on the server.
We weren't looking for thin desktops. We wanted to reuse the application logic built into the system, and that meant disqualifying this particular product.
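The difference we tripped over can be sketched in a few lines. Assume a toy order-entry rule (the pricing function and names here are illustrative, not the vendor's code): a thin client keeps presentation separate from a reusable business module, while a fat client tangles the two together - and running the fat client's code on the server doesn't untangle it.

```python
def business_logic(qty: int) -> int:
    """Reusable module: the pricing rule, with no presentation mixed in."""
    return qty * 19  # made-up unit price

def thin_client(qty: int) -> str:
    # Thin client: presentation only; every rule is delegated.
    # New front ends can call business_logic() directly.
    return f"Total: {business_logic(qty)}"

def fat_client(qty: int) -> str:
    # Fat client: presentation and the pricing rule tangled together.
    # Executing this on the server makes a thin *desktop*, not a thin
    # client - the rule still can't be reused without the formatting.
    return f"Total: {qty * 19}"
```

Both produce the same screen output; only the first leaves the business logic somewhere a new application can get at it.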
Take a minute to think about some of the claims you've read concerning the network computer. Ever hear someone refer to it as a thin client architecture?
I have, but it isn't any kind of client architecture. It's a distributed computing architecture. Whether the applications you run on an NC use thin clients, fat clients, or terminal emulators depends on how you (or the vendor) partition the application logic and where you execute the various modules that make up the application.