One of my pet column topics would have to be quality and the longevity of systems. Quality, I think most of us would agree, isn't at the level that it should be in our industry, and systems tend to last much longer than we imagine. It's a situation that invites disastrous consequences.
There are some who would call the Y2K problem one of those consequences, but I'm not quite ready to go there, because the problem stems more from longevity than from quality issues. As much as vendors would love to do something about the longevity "problem", it's not likely to go away. That leaves quality as the major area where we can - and should - make some real improvements.
I recently had lunch with a practice leader from a very large integrator. Talk turned to the methods integrators use to ensure that their products and services maintain high quality. We touched on some of the major quality initiatives of the last decade or so - Baldrige, ISO and TQM - and agreed that all had problems when it came to measuring and ensuring service quality.
In general, statistical quality methods like Baldrige and TQM are fine-tuned for manufacturing. Checking widgets for size tolerance is not the same as figuring out whether the business-process re-engineering you've just proposed will, in fact, make life better for the customer. I wasn't surprised by this part of the discussion, but I wasn't quite prepared for what came next.
"Programming is really more like an art than a science," he said. "I'm not at all sure you can compare what goes on in software with projects like building a bridge." The comparison put me off a bit, because it seems to contradict so much of what we like to tell our customers, and ourselves, about the business we're in.
Computer science is the academic discipline many of us study. Business-process re-engineering is the source of so many service dollars, especially since the arrival of ERP and the Web. The language we use and the methods and processes we employ all come from the older technology disciplines of engineering, hard sciences, and quantitative analysis.
How is it, then, that we can take these ingredients and turn out products more akin to impressionist painting than civil engineering?
I think part of the answer has to do with the industry's age. In most other technology disciplines, standards of quality are formed from the statistical results of doing the same sort of thing many times. In our case, the one-off model is all too prevalent.
A larger part, though, has to do with the romantic image we've fostered about computers. We're not part of some large corporate machine - we're technology cowboys (and cowgirls), staring down the bad guys under the blazing sun. We're living on "Internet time", and we'd rather do things fast than do them right. And let's not forget that the wonderful things we do are so darned complicated that it's simply impossible to build them without errors.
I think it's high time the industry got serious about quality. I'd like to know what your firm does to make sure that its programming and services are high quality, and I'd like to share the programs that work for you. Hopefully, you can provide some insights into how best to ensure quality - and how we can all write better software and build better systems.
Curtis Franklin is executive director, technology, at ARN's US sister publication, Solutions Integrator. He can be reached at email@example.com