Tech's all-time top 25 flops
- 22 January, 2008 11:20
Imagine how different the tech industry might have been had Gary Kildall accepted IBM's offer, back in 1980, to license his computer operating system for a top-secret project. CP/M would have been the OS that shipped with the original IBM PC, and the world might never have heard the name of Kildall's competitor, who eventually accepted the contract: a Mr. Bill Gates.
For all the amazing advances that the computing industry has brought us over the years, some of its most pivotal moments are memorable for all the wrong reasons. Not every idea can be a winner, and not even Microsoft can avoid every misstep. But as they say, those who forget history are doomed to repeat it -- and then again, others just keep screwing up. In the interest of schadenfreude, then, here is a look back at the last 20 years' worth of blunders, fumbles, also-rans, and downright disasters that you may have forgotten about -- or wish you could.
25. IBM PS/2. The original IBM PC hit the market like lightning in 1981. Unlike earlier IBM computers, it was built with off-the-shelf parts instead of proprietary components, making it the most affordable business machine yet. But by the late 1980s, IBM found itself edged out of the market by Compaq and the other PC clone makers. Its solution? Try again with proprietary components, of course!
The Personal System/2 series, introduced in 1987, was meant to be "software compatible" with the PC, but its Micro Channel Architecture made it incompatible with existing hardware. The clones had no such problem. Like the disastrous PCjr before it and the PS/1 series to follow, the PS/2 convinced customers that lightning would never strike twice in IBM's PC division.
24. Virtual reality. In 1982, the movie "Tron" imagined a man traveling the eerie internal landscapes of a computer. Fifteen years later, the technology arrived to make it happen -- sort of.
Building a spatial interface for the Internet was all the rage in the late 1990s, owing in part to VRML (Virtual Reality Modeling Language). The problem was, it didn't make much sense. The Web put the world of information at your fingertips; leave it to software engineers to find a way to send it back down the street, across a bridge, and up two flights of stairs.
The concept lives on today in Second Life, which seems to think the problem is not enough advertising. But the truth is that mainstream users have never warmed to VR. Wake us up when we can ride real lightcycles to work and meet our clients on the Game Grid.
23. Compression wars. What do you do when another software company copies your code and releases an improved version of your own product? Sue them into the ground, right? That's what System Enhancement Associates (SEA) thought in the 1980s, when Phil Katz released a clone of SEA's archive compression program, Arc.
Katz's hand-optimized assembly language provided better performance than the original Arc, but because Katz had borrowed code from SEA's product, SEA successfully sued for copyright infringement. Customers, however, felt betrayed. They saw SEA as a bully trying to stifle Katz's superior software. When Katz came up with his own high-performance archive format in 1989 -- called Zip -- they ditched Arc in droves, and SEA's business never recovered.
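Katz's Zip format eventually standardized on the DEFLATE algorithm, which Python's standard zlib module implements today. As a rough illustration of the kind of round trip an archiver performs (the sample text is our own, not from any historical benchmark):

```python
import zlib

# Sample data of our own invention; the repetition makes it highly compressible.
data = b"The quick brown fox jumps over the lazy dog. " * 100

compressed = zlib.compress(data, level=9)   # level 9 = slowest, best compression
restored = zlib.decompress(compressed)

# A lossless round trip: what comes out must equal what went in.
assert restored == data
print(f"{len(data)} bytes -> {len(compressed)} bytes")
```

Performance on exactly this kind of compress/decompress loop, written in hand-tuned assembly, was what set Katz's tools apart in the shareware era.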
22. Apple OpenDoc. Long before the Cocoa and Carbon APIs earned raves from Mac OS X application developers, Apple put its weight behind another innovative programming technology. Called OpenDoc, it was a way for developers to build applications out of lightweight, modular components. After all, what is a word processor but a text editor, a spell checker, a file manager, and a few other modules all thrown together? With OpenDoc, developers could mix and match, building their applications out of all the best bits.
Unfortunately, the concept never caught on. As it turned out, most applications weren't really as modular under the hood as they appeared on the surface -- and it didn't help that those so-called lightweight components turned out to be memory hogs that ran like molasses. After five short years, the book on OpenDoc was closed.
21. Push technology. In 1992, PointCast had a clever idea: Why not make it possible to view stock quotes, headlines, and other information in real time, without browsing the Web? Instead, the PointCast client would "push" the information direct to the desktop, all day long.
The idea spawned a horde of imitators. Unfortunately, no one foresaw the strain that all that pushing would place on the limited Internet connections of the time. Network managers banned the client, and modem-based home users balked at the ads being pushed to them along with their sports scores.
News Corp. once offered US$450 million for PointCast. Two years later, the push craze had evaporated, and it sold for a paltry US$10 million.
20. Copland. Some fumbles can be recovered. And it's true; today, Mac OS X is an impressive operating system. But imagine how much further Apple could have gone if it had delivered its next-gen OS when it originally intended to, back in 1995.
Copland was meant to be the modern successor to the original Mac OS, but years of political infighting had hobbled Apple's development department. For all its engineering talent, Apple clearly could not produce a modernized Mac OS on its own. Instead, it bought Steve Jobs' NeXT and used its OS as the basis for the Mac OS X that ultimately shipped in 2001 -- ironic, considering that Jobs had left Apple over political infighting a decade earlier.
19. GNU Hurd. When Richard Stallman launched the GNU project in 1983, his goal was to build the world's first completely free operating system: kernel, tools, utilities, applications, documentation, and all. Good thing he didn't start from the bottom up.
Almost 25 years later, there is still no GNU kernel. The Hurd, as the proposed kernel is known, should have been the Free Software movement's crowning achievement. Instead it's become the poster child for collaborative software development gone wrong, topping the lists of vaporware year after year. And it's a shame -- because wouldn't it be great if there were a free OS kernel for everyone to use?
18. Oracle Raw Iron. What's the best OS for your database server? Should you run it on Windows? Linux? AIX? Something else? Back in 1998, Oracle's answer was none of the above! Instead, Larry Ellison promised an "appliance" version of Oracle 8i, called Raw Iron, that ran atop the bare server hardware. No longer would Oracle customers need to worry about a separate support contract with an OS vendor: Oracle would handle the whole show.
Behind the scenes, prototype Raw Iron boxes ran a custom version of Sun Solaris, but it didn't matter. Customers had seen through Larry's hand-waving, anyway. When nobody bit, the project was quietly shelved -- just a few years before the market for network appliances took off.
17. B-to-b e-commerce. As the dot-com craze waned in the early 2000s, venture capitalists clung to a last-ditch idea: If all those startup e-commerce companies weren't striking gold with the consumer public, maybe they could peddle their wares to other, more established companies instead? They called it b-to-b e-commerce, and a generation of would-be digital disintermediators was born.
The problem was that few of their potential customers were interested in cutting out the middlemen -- not if it meant trading them for an unproven online startup with a tiny sales force and no real experience in inventory management. In the end, though, the b-to-b players did deliver some excellent deals -- when their assets were offered up at auction.
16. Apple Newton. It's no iPhone, but by some measures the Newton still beats the pants off any PDA since. Rabid fans wax nostalgic about the Newton OS, and breathy rumors of a new Apple PDA remain a staple of Macworld Expos. Alas, the Newton never had a chance. Introduced in 1993, the Newton MessagePads were bulky, with lousy battery life. While Palm and Microsoft's PDA partners were building devices that could actually fit in your pocket, Apple answered with a full-sized keyboard and a clunky clamshell for the Newton eMate 300 in 1997, then threw in the towel as its losses mounted.
It's a shame. With some software tweaks to suit business users, the iPhone and the iPod Touch could get Apple back in the game. But given the bad taste left by Newton, who'd be brave enough to suggest it to Steve now? (Besides us.)
15. Palm OS Cobalt. The only thing worse than giving up on a viable market is to win it totally, then run it into the ground. What else can be said about Palm? Its devices were revolutionary, its competition from Microsoft laughable. Then came years of mergers, acquisitions, spin-offs, and rebranding. All the while, the products stagnated. When Palm finally delivered its Cobalt OS in 2004, the purported successor to the aging Palm OS failed to win any licensees.
Now, Palm is resorting to a Hail Mary: The next Palm OS, we're told, will be based on Linux. But when it might appear is anyone's guess, and by then it won't matter. We'll all be too used to this year's Treos, running Windows Mobile.
14. Netscape 6. The turning point of the browser wars came in 1997, with the release of Internet Explorer 4. For the first time, IE was better than Netscape Communicator. Not only was it faster, but it had more features and better standards compliance.
Netscape should have struck back immediately, but instead it dragged its feet. As Microsoft pressed ahead with IE5, Netscape's open source Mozilla project foundered, producing nothing but buggy "preview releases." When Netscape 6 finally appeared, years later, it was a bloated, sluggish mess. The war was lost, and now even the Netscape brand is scheduled to pass into oblivion.
Ironically, the former Netscape Communicator suite lives on as an open source project, called SeaMonkey -- presumably because it sounded great at first, but the real thing is a disappointment.
13. Search portals. Where are they now? At the height of the dot-com boom, Web surfers had a plethora of search engines to choose from: AltaVista, Excite, InfoSeek, Lycos, and many more. Today, the major players of the past are mostly dead. A few have soldiered on, such as Ask.com, but only after repeated redesigns.
Chalk it up to old-fashioned hubris. Instead of concentrating on their search offerings, the first-generation search engines fell victim to the portal arms race. They built up dashboards full of sports scores, stock quotes, news headlines, horoscopes, the weather, email, instant messaging, games, and sponsored content -- until finding what you wanted was like playing Where's Waldo. Neither fish nor fowl, they became awkward hybrids of search engine and general-interest portal. The world went to Yahoo for the latter. And when an upstart called Google appeared with a clean UI and high-quality search, users told the other engines to get lost.
12. IPv6. Few topics spark more debate than IT's equivalent of global warming. According to some experts, the question isn't whether we will run out of IPv4 network addresses, but how soon. And there's no Kyoto controversy here; federal policy already requires that government offices transition to IPv6 by 2008. So why is everyone still dragging their feet?
Quite simply, IPv6 is a fix for a problem nobody has yet. Stopgap solutions such as NAT, while infuriating to network engineers, have proven effective. And IPv6 offers no compelling features to offset the headache of implementing it. In other words, until someone offers the equivalent of carbon credits for networking, IPv6 is one truth that's just too inconvenient.
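The arithmetic behind both the exhaustion scare and the NAT stopgap is easy to check with Python's standard ipaddress module (the private-range example is our own choice for illustration):

```python
import ipaddress

# Total address space of each protocol version.
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
ipv6_total = ipaddress.ip_network("::/0").num_addresses       # 2**128

print(f"IPv4: {ipv4_total:,} addresses")    # about 4.3 billion
print(f"IPv6: {ipv6_total:.2e} addresses")  # about 3.4e+38

# The NAT stopgap: an entire private range can hide behind one public address.
private = ipaddress.ip_network("192.168.0.0/16")
print(f"Hosts behind a single NAT gateway: {private.num_addresses:,}")
```

Those 65,536 private hosts per public address go a long way toward explaining why IPv4 exhaustion kept failing to bite.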
11. Microsoft Passport. We all have too many online accounts and too many passwords to go with them. If Microsoft wants to save us some hassle by remembering them all for us, why not, right?
But Web users didn't flock to Microsoft's single sign-on service when it went live, and neither did partners. The idea was good, but the implementation was lousy. First came public debate over privacy concerns. The discovery of security holes in the Passport software was just the final straw.
To be fair, Microsoft's competitors had no luck with the concept, either. As it turns out, people think handing their personal security credentials to a third party just to save a little effort is a bad idea. Who knew?
10. Itanium. For almost as long as its chips have powered PCs, Intel has had to compete with clones. So when Intel offered a high-powered 64-bit chip for server computing, it almost made sense that it should be a completely different design than the classic x86. Almost, that is, except for one thing: The reason Intel's rivals released clones was because people wanted x86s.
Correction: They wanted affordable x86, and Itanium was neither. But although it was a commercial flop, the Good Ship Itanic sails on, though to where we cannot say. It takes a brave captain, indeed, to follow into those waters.
9. Mac clones. Nobody dangled a carrot in front of its customers like Apple did in the mid-1990s. For years, the Mac faithful had clamored for alternatives to Apple's expensive, proprietary hardware. Apple gave its blessing in 1995, and the market flooded with discount Mac clones from the likes of Power Computing, Motorola, and Umax. Then, just two years later, they were gone.
Fans of the clones fumed, but for Apple the only thing more damaging than this bait-and-switch was greenlighting the clones to begin with. Clone makers used generic PC components, an affront to the Mac's carefully cultivated image, but their popularity underscored how mediocre Apple's own offerings had become. As it turned out, what Apple really needed wasn't alternative manufacturers, but alternative management.
8. Electronic currency. If you're looking for the perfect complement to e-commerce, what else but e-money? If storefronts can go virtual, then why shouldn't customers' wallets?
The dot-com era spawned a cottage industry of alternative currencies and "loyalty programs." Many positioned themselves as solutions for micropayments -- the idea that online content providers could actually collect the infinitesimal fees that the market would bear. As it turned out, however, while employees of Internet startups might have been content to be paid in paper stock options, consumers preferred cold, hard cash -- proving once and for all that the only thing worse than a wooden nickel is a digital one.
7. The 64-bit desktop PC. We're all used to the major chipmakers wowing us with numbers, but as the gigahertz race slowed, life began imitating "Spinal Tap." How many bits does your processor support? 16? 32? Mine goes all the way up to 64!
Apple and IBM may have started the 64-bit craze in 2002, but AMD and Intel quickly followed suit. The elephant in the room, however, was that nobody could think of a serious reason why the average PC user might need to be able to access 16 exabytes of RAM.
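That 16-exabyte figure is just pointer arithmetic: an n-bit address can reach 2^n bytes. A quick back-of-the-envelope check:

```python
# Bytes addressable with an n-bit pointer.
def addressable_bytes(bits: int) -> int:
    return 2 ** bits

# 32-bit machines top out at 4 GiB of address space...
print(addressable_bytes(32) // 2 ** 30)  # 4 (GiB)

# ...while 64 bits reach 16 EiB -- the "16 exabytes" in round numbers.
print(addressable_bytes(64) // 2 ** 60)  # 16 (EiB)
```

In 2002, when typical desktops shipped with well under a gigabyte of RAM, that extra headroom was a solution looking for a problem.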
64-bit OSes still lag behind the 32-bit versions, and the applications aren't there, but who cares? After all, it's not how many bits you've got -- it's the cores that matter. Or something.
6. Carly Fiorina. Call her the anti-Steve Jobs. During her 1999-2005 tenure as CEO of Hewlett-Packard, Carly Fiorina proved that she could reverse decades of geek goodwill and alienate customers like no one else. She oversaw the spin-off of HP's well-respected instruments and medical equipment business, outsourced its beloved calculator division, then issued 7,000 pink slips. On her watch, HP brought in more profits from printer ink than PCs. But she'll be remembered most for HP's acquisition of Compaq, among other dubious efforts to give the "stodgy" HP a more consumer-friendly face (does anyone remember the licensed iPods?).
HP's stock price sagged under Fiorina, but she still walked away with a US$21 million severance bonus. Not bad, considering that HP began in a garage with just US$538 in capital.
5. Digital rights management. Has any industry ever invested so heavily in a technology that its customers didn't want? Again and again, media companies race to market with one new twist on digital copy protection after another, then offer up their own failures as "proof" that the market isn't ready for downloadable content.
It's strange, too, when you consider that every DRM scheme yet invented has been quickly compromised. Some experts believe that the whole concept of DRM technology is inherently flawed. Still, as encouraging as Steve Jobs' recent statements have been, it's difficult to speculate on when the media giants will give up on this harebrained idea. Greed is a powerful motivator.
4. Paperless office. Who needs paper? For years, office managers have been rubbing their hands at the notion that databases, spreadsheets, e-mail, and digital documents can replace traditional ledgers, forms, and faxes. But while saving the environment is a laudable goal, the truth is that it's actually harder to preserve digital records than the old-fashioned kind.
Increased regulatory pressures have helped fuel a cottage industry around information lifecycle management, but the real-life capabilities of these products are a far cry from their lofty ambitions. Meanwhile, digital record-keeping presents security and privacy risks unheard of in the days of locked filing cabinets. The paperless office may finally be here, but it's enough to make you wish for a good, old-fashioned memo pad.
3. iPod imitators. Apple has always been the underdog of the PC market. Maybe that explains the electronics industry's chronic habit of underestimating the iPod. Would-be competitors come and go -- and go they do, swiftly, once customers get a gander at their second-rate hardware and atrocious interfaces. From Microsoft's ugly, feature-hobbled Zune to TrekStor's abortive plan to release a player under the cringe-inducing moniker i.Beat blaxx, it seems nobody can get it right, even with Apple paving the way.
Frustrated, SanDisk recently resorted to Swift Boat-style attack ads, referring to the Apple faithful as "iChimps." The question is, is it the iPod they hate? Or the millions of potential customers who refuse to settle for an inferior copy?
2. Windows Vista. What if you threw a party for the world's most revolutionary operating system and nobody came? Then again, by the time Windows Vista actually shipped, the "revolution" looked more like a failed coup.
Despite repeated delays, much-anticipated features such as WinFS and the Monad command shell never made it into the shipping version of Vista, even as its system requirements grew and grew. When the dust finally cleared, the new OS seemed like little more than a bloated rehash of Windows XP, touched up with a fresh coat of 3D-rendered paint. Add sluggish performance, spotty driver support, UI annoyances, and a dubious application security model, and suddenly desktop Linux doesn't sound like such a crazy idea after all. Who knows what could convince risk-averse enterprises to make the leap to Vista now -- but hey, there's always Service Pack 1, right? Or maybe Service Pack 2.
1. Security. Computers influence every aspect of our business lives. We trust them implicitly to manage our records, compute our figures, and facilitate our communications. When will we ever learn?
Thirty years into the personal computer era, it seems security is only getting worse. Computer viruses and worms, though simplistic in comparison to any useful application, have proven as resilient as the common cold. The Web, e-mail, and instant messaging have given criminals unprecedented opportunities for fraud, scams, and electronic spying.
In 2007, corporations lost customer data to cyberthieves like never before. And today's vast digital repositories make for very juicy targets that can be copied onto a few DVDs slipped unnoticed into a jacket pocket. If auto manufacturers approached safety the same way that software makers handle security, we'd all be driving Ford Pintos and Yugos. And airline security would resemble the "systems" that buses use to catch fare-dodgers.
Now that we've built a digital world on an insecure foundation, the solutions for security are really hard -- maybe too hard. We may just need to live with the fact that computer technology is largely insecure, so caveat utilitor.