In less than one short year, we'll be celebrating the big 2-0-0-0 (perhaps in the dark). That means 1999 should be a dull year by comparison. It probably won't be, but it should be. This is the year when most IT executives should concern themselves with things such as year 2000 issues and reducing total cost of ownership at the desktop - and nothing else.
As I assemble my predictions for the next year or two, I'm afraid we'll instead get distracted by advancements in technological doodads and gizmos. The Internet is driving most industry advancements lately, but not every step forward is confined to communications-related technology.
The Internet will continue to drive the progress of loosely related technologies such as audio- and video-compression techniques, hardware-accelerated speech recognition, and hardware-accelerated 3D graphics rendering. I know it sounds odd, but I recommend keeping a watchful eye on this last category, particularly on the company 3Dfx Interactive. This company makes the most popular 3D hardware-accelerator chip sets, the latest of which is the Voodoo2.
I bought a Creative Labs 3D Blaster Voodoo2 board for the family PC just before Christmas. Shortly thereafter, I watched in awe as my son played Sierra's Half-Life, the latest 3D action shooter game. I was so blown away by the quality of the 3D gaming experience that I bought two more 3D action games for him and a Diamond Monster 3D II card for me. (The Creative and Diamond cards are essentially the same. Both are based on the 3Dfx Voodoo2 chip set.)

Now, I could go on at length as to why it should be obvious that advanced 3D hardware acceleration is on the verge of invading IT. Just look at how many of the latest mainstream cards from Matrox, ATI, Diamond, and Number Nine include advanced 3D rendering features.
But here's what really tipped me off: I despise 3D shooter games. I hated Doom. I hated Quake. And I really shouldn't waste time playing these games even if I loved them. But the latest wave of software and hardware makes the 3D experience so attractive that I can't resist jumping into an Internet game of Half-Life occasionally, just to have my son blast me to smithereens every 30 seconds.
This tells me that one of two things has finally happened. Either my brain has fried to a crisp, or software rendering and hardware-accelerated 3D have advanced to the point where an imaginative company can figure out a way to make you and your users want it.
Perhaps both are true, because personally, I can't imagine what kind of mainstream IT productivity application would require 3D hardware acceleration. But I know someone else can. And mark my words: someone will. I predict that a clever company will exploit 3D hardware acceleration in a productivity application within one to two years. In two to three years, most of us will take it for granted that the latest desktop machines include the hottest 3D hardware, most likely on the motherboard.
Expect hardware-assisted speech-recognition technologies to emerge over the next year or two as well. As wonderful as the current round of speech-recognition products may be, they require far too much CPU power and RAM to deliver acceptable results.
The funny thing about the hardware-speech category is that the technology is anything but new; in fact, it seemed to disappear just as speech recognition began coming into its own. IBM used to make a speech-recognition card but dropped it in favour of standard sound cards because the dedicated hardware was simply too expensive. Now that speech recognition is going mainstream, look for someone to pick up where IBM left off.
All of this is good news for Intel, of course, because you still need a fast CPU to make the most of dedicated hardware-acceleration cards. (Try running a Voodoo2 card on a Pentium 133 and then on a 333MHz Pentium II to see for yourself.) Add to this the increasing popularity of Linux, and Intel executives can put away the bottles of Ripple and break out the good champagne for the next couple of years.