Pauline Nist has been around the block. She began her career at Digital in 1975 and moved to Tandem Computers in 1997. After Compaq purchased Tandem, Nist stayed on in her role as senior vice president of products and technology at Compaq's Tandem division. Nist discussed with IDG's David Pendery the changes in the computer industry during the past 20 years, the key trends occurring now, and what the future holds for enterprise computing.

IDG: Tell me about the computer industry's evolution from proprietary systems in the 1970s through to the adoption of standards-based, distributed computing in the 1980s and 1990s.
NIST: What made Digital a company was the PDP-11 [Digital computer family manufactured from 1970 to 1997]. It was inexpensive, and people walked up and touched and used it, rather than handing punch cards to somebody in a glass house.
Digital, in its early days, really grew very much the way Compaq grew in its early years. It sold through channels; the channels were the providers. It was in the middle years that Digital really made the decision to walk away from that OEM kind of play and tap the horsepower and the sophistication of the VAX.
You can argue about the pros and cons of that, but it obviously got them into a situation where they missed the whole workstation and PC revolution.
Compare and contrast the parallel processing, non-uniform memory access [NUMA], and symmetric multiprocessing [SMP] architectures.
How they've been used and deployed depends on software. [To] leverage the broadest customer base, it has to be something that the broadest number of software providers can implement. That is what has made SMP so successful.
[But] both NUMA and parallel processing have as their allure these cheap, high-performance micro-engines, and if you can lash them together, you can deliver endless amounts of computing power. But the crux of the matter was that nobody made available a moderately high-volume application like a database engine for these platforms.
What were the most important trends and developments in servers and processing power during the past 20 years?
I think one of the important ones was time-sharing. One might argue that time-sharing created the appetite for the PC.
The next thing that came along was the microprocessor. Microprocessor technology took us one level further.
Then Unix set the stage for the "open operating system." Until then, there was no notion of easily moving your application between vendors.
The combination of time-sharing, the microprocessor and the open operating system all set the stage for PCs, for people willing to accept a broad-based platform based on an industry standard.