It's fairly well accepted that servers are pretty dull little pieces of equipment, unlikely to stir much interest except in the event of their failure and subsequent power to frustrate and annoy. Let's face it, if there were a prize for the most abused explanation for every damn technology glitch in the modern office, it would have to be "the server's down".
Yet in the background of this often unjustified ill feeling, the results of several years of quiet toiling by the major vendors to improve their respective platforms, boost processing power, deepen storage, tweak security and so on, are now bubbling to the surface. The post-Y2K, GST race to catch the big fish, be they dot-com or dot-com savvy, is on.
According to David Booth, Asia-Pacific product manager for servers at Hewlett-Packard, one of the prevailing trends in the market at the moment is the move towards "more robustness with clusters or greater redundancy in systems design". It's what many agree is the return to the old "white coats and glasses" values of computing.
Increases in the speed of CPUs are also a major factor, and one which is certainly fuelling development of low-end, PC-based servers, he adds.
Further, he predicts that storage will increasingly play a central role in the server market, with the evolution of storage area networks and more complex solutions using RAID or other technologies. But while the technology thumps ahead, the evolving electronic business environment is changing how companies view and purchase their server solutions.
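For readers curious about the principle behind the redundancy Booth describes, RAID 5 is the textbook case: data is striped across several drives with a parity block computed by XOR, so the contents of any single failed drive can be rebuilt from the survivors. The sketch below is illustrative only, not any vendor's implementation:

```python
# Illustrative sketch of RAID 5-style XOR parity (not any vendor's implementation).
from functools import reduce

def parity(blocks):
    """XOR the corresponding bytes of all data blocks to form the parity block."""
    return bytes(reduce(lambda a, b: a ^ b, group) for group in zip(*blocks))

def rebuild(surviving_blocks, parity_block):
    """Reconstruct a lost block: XOR the parity block with the surviving data."""
    return parity(surviving_blocks + [parity_block])

# Three data "drives" plus one parity drive.
drives = [b"AAAA", b"BBBB", b"CCCC"]
p = parity(drives)

# Drive 1 fails; its contents come back from the other drives and the parity.
restored = rebuild([drives[0], drives[2]], p)
assert restored == b"BBBB"
```

The same XOR property is why a RAID 5 array survives exactly one drive failure: lose two drives and the equation no longer has a unique solution.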
With the inherent unpredictability of technology projects these days, especially those directed towards the high growth yet volatile "dot-com" sector, many companies are opting for more of a "utilitarian-based supply structure", Booth said. In other words, the market and indeed the channel are being forced to accommodate a pay-as-you-use model, or mini-outsourcing model as an alternative to costly capital investment.
It's an interesting point, according to market analyst IDC's Graham Penn, and one which goes straight to the heart of the service conundrum in the channel.
"Most dealers are only capable of handling e-business for small organisations," he said.
It follows then that once companies reach a certain point they will outgrow their provider unless they are able to scale in pace with the growth in the business.
"The channel will either have to grow its capabilities or watch their clients leave to spend their money with the big service companies," which have already flagged their plans to trawl for business nearer the forest floor.
"Resellers that stay stuck in this product mode will ultimately fail in the server game - users don't have the capability or time to go it alone," Penn said.
Andrew Wilson, consulting systems engineer with one of Australia's largest resellers, BCA IT, agrees. "Availability, robustness and 24 x 7 support are all things coming back into fashion," he said.
As a reseller, he adds, BCA IT must look very closely now at precisely what a company needs to best fit its stated goals. And it's not easy. His time is split equally between the role of pre-sales support engineer and as consultant advising companies on the best solutions.
While vendors are locked into a race to deliver the "easiest-to-build" solution, this is creating a complicated landscape for users in the market for a solid server/storage combination.
"There's a lot of empty promises in the market at the moment," Wilson said. While declining to point the finger at any particular companies, he stresses there are a lot of "smoke and mirrors".
"In a way we are in the lap of the gods," Wilson said, adding that smart resellers can get a headstart in developing the services side of their business.
People are certainly a lot more computer literate these days and increasingly organisations know what they want "but they don't know how to achieve it", Wilson said.
"In today's dynamic business environment, largely driven by the desire for organisations to be, or to be seen as, 'e-ready', server buying decisions are being based more than ever on long-term corporate goals," according to Mathew Boon, an analyst with market researcher GartnerGroup.
He stresses that server solutions must now deliver some sort of ongoing investment protection. With the inexorable drift towards 100 per cent computerisation within many companies, vendors and channel players have a greater responsibility to get it right the first time, and forever after.
"The key to all decisions is how much the purchaser can leverage from the technology to ensure it meets their needs not only today, but two, three or even five years from today. The capability for server technology to grow as a business expands is a key decision factor in any IT purchase decision," Boon said.
Key to this, he believes, is the integration between new solutions and what a company already has.
"Organisations not only require, but demand solutions that fit their business model, not that of the vendor or reseller. They don't care where the components of that solution come from as long as it can be integrated within their existing infrastructure whilst providing the platform for future growth," Boon said.
The performance race sees Compaq in the lead at the time of publication. But, as Penn points out, it is "a game of leapfrog that is seeing users spoilt for speed".
Stevan Caldwell, Compaq's server product manager for the AlphaServer, is bullish about the future of the technology the company acquired with Digital in 1998.
Compaq's commitment to maintaining the platform has placed it in a commanding position to take exactly that high-end business Wintel is so desperate for, he believes.
Now that everything old - "no downtime, no question" - is new again thanks to e-commerce and the unstoppable "Webification" of everything, Compaq is reaping real value from its original $US9 billion or so investment in Digital.
In terms of its top clients at the moment, few come as high profile as Celera, the wholly commercial half of the two-horse race to map the human genome.
An article in this month's Time magazine suggests that one of the reasons Celera appears to have pulled ahead of its government-sponsored rival was its ability to sequence code much faster. The reason, according to Caldwell, is that the Celera team is using AlphaServers.
Not to be outdone, however, IBM has its work cut out for itself in less than two months when it expects to process more than 100 million hits per day for the Sydney Olympics.
Both examples illustrate the staggering levels of performance being achieved by the top vendors in the high-end server space. The relevance to medium- and even small-scale companies that may feel out of this league is the breathtaking improvement in price/performance, which is increasingly placing such power within easy reach.
Just two weeks ago, IBM announced a chip technology that it said can effectively double the memory capacity of a PC server. Called Memory Expansion Technology (MXT), the new memory-controller chip will reside between the processor and the main memory on a server's motherboard. Products featuring the new technology are expected to appear in Australia late this year and early 2001.
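IBM has not published the fine detail of MXT, but the idea behind compressed main memory is easy to demonstrate: much of what sits in RAM is highly redundant, so a controller that compresses data on the way to DRAM can often fit roughly twice as much into the same physical chips. In the sketch below, the general-purpose zlib algorithm stands in for IBM's undisclosed hardware scheme, and the sample data is invented for illustration:

```python
# Rough illustration of why compressing data between the controller and DRAM
# can double effective memory. zlib stands in for IBM's hardware algorithm;
# the sample "server data" below is invented for illustration.
import zlib

page = b"<row id='1'>status=OK</row>" * 150  # repetitive, as in-memory data often is
compressed = zlib.compress(page)

ratio = len(page) / len(compressed)
assert ratio > 2  # comfortably better than the 2:1 the announcement implies
```

Real workloads compress less uniformly than this contrived page, which is why a claim of "effectively double" is an average rather than a guarantee.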
But even the least tech-savvy user now knows that speed means nothing without the other pieces of the puzzle.
There are loads of applications to sift through. There are new security concerns brought about by things such as the Love Bug and then, ultimately, there is the issue of what you do with all the data once you start collecting it.
"Coupled with the decision on which server platform to invest in is what storage capabilities that vendor offers or, if compatible, what storage solution can be purchased from a third-party vendor. Although we are not quite at the stage where the storage component of a wider solution is what drives the decision, rather than the server itself, the market is certainly heading in that direction," Gartner's Boon said.
According to IDC's Penn, "e-business is driving the storage business." And this, according to BCA IT's Wilson, is driving a massive boom in knowledge management.
Nowadays, he adds, with the heightened need for transactions, data warehousing and analysis, it is not unusual to see companies buying a terabyte of storage per month.
According to recent IDC figures, Hewlett-Packard is second in the Australian market for Intel-based servers behind Compaq, but number one overall throughout the Asia-Pacific region.
Fundamentally, the Australian server market really comes down to these two players, plus IBM and Sun Microsystems, according to IDC Asia-Pacific head Graham Penn.
But despite what may seem slim pickings for the channel, with the development of the super-powerful Alpha platforms, huge progress on the Intel processor and IBM's success in maintaining its four core technology platforms - not including the newly acquired Sequent technology - never before has there been so much choice.
But resellers have their work cut out for them. As Penn points out, organisations now have to gear their IT management to be reminiscent of the old days where 24 x 7 service and what many now call "bullet proof computing" were not negotiable.
"What you have to do all the time is cater for the peak load," Penn said, adding that even the smallest companies will find it difficult to escape this new imperative.
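Penn's point is easy to quantify against IBM's own Olympic figure. A hundred million hits a day averages out to only about 1,160 hits per second, but web traffic is never flat, so the infrastructure must be provisioned for the bursts. The back-of-envelope sizing below uses a five-fold peak-to-average factor, which is an assumption for illustration, not a figure from IDC or IBM:

```python
# Back-of-envelope capacity sizing: average rate from a daily hit count, then a
# peak allowance. The 5x peak-to-average factor is an illustrative assumption,
# not a figure quoted by IDC or IBM.
hits_per_day = 100_000_000            # IBM's Sydney Olympics estimate
avg_per_sec = hits_per_day / 86_400   # seconds in a day
peak_factor = 5                       # hypothetical burstiness allowance
peak_per_sec = avg_per_sec * peak_factor

print(round(avg_per_sec))   # ~1157 hits/second on average
print(round(peak_per_sec))  # ~5787 hits/second to provision for
```

The gap between those two numbers is exactly the "cater for the peak load" discipline Penn is describing: averages flatter the requirement, peaks set it.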
This is something IBM sees itself especially well placed for. "In addition to improved memory technology and faster CPUs, we are also seeing a move towards fully switched fibre in the server to better deal with high-transaction volumes," Penn said.
"IBM's server technology roadmap is stronger now than it has ever been," according to IBM server product manager, Stephen Sherry.
Big Blue now has no fewer than five separate server platforms that cater to just about any need.
In addition to its faithful RS/6000 and MVS, IBM has defied critics of the AS/400 by preserving and promoting it as a key pillar of the company's e-business strategy. Sherry added that IBM's Intel-based Netfinity servers are selling strongly despite being only third in Australia.
But it is the company's newest addition, gained last year with the acquisition of Sequent, which Sherry believes will have the greatest impact in the near future.
"Sequent's Numa-Q architecture will become one of the key building blocks for IBM servers in the future," Sherry said.
Its primary benefit is the ability to run Intel-based Unix and Windows NT concurrently on the same machine, an important feature given the ongoing push for consolidation amongst big IT houses.
"When we talk about this so-called 'new computing model', the strength of the products and services offered by IBM and its partners has been seriously underestimated," Sherry said. "IBM is very big on breadth."

Recognising the challenges for the channel in terms of education and understanding the new demands of its customers, IBM boasts one of the most comprehensive training programs for its channel partners, with roots back in the early 80s, Sherry said.
"Nowhere does IBM rely more strongly on its partners than in the server space," he added.
The most pressing example of this need, according to Gartner's Boon, is the current threat of confusion in the market brought about by the Y2K mentality and fiscal caution due to the GST.
"These market forces have thrown what are normally fairly predictable patterns of shipments into disarray," he said.