
Taking a look in the rear view mirror

The Internet goes commercial

The Internet scene in the mid-1980s was dominated by discussions of acceptable-use policies, through which government and academic users sought to restrict Internet access to, well, government and academic users. Unacceptable uses of the Internet, such as porn and spam, hadn't been thought of yet; in those days, unacceptable meant commercial. Today, billions of dollars in transactions flow through the Net every month. E-commerce became an obsession when the dotcom bubble started to inflate in 1997. Even after the bubble popped in 2000, however, corporate enthusiasm for the Internet hardly slowed. Today, some of the hottest ideas in the industry - Web services, VoIP, service-oriented architectures and utility computing - are grounded in the Internet.

Monopoly musical chairs

IBM dominated computing until the late 1980s. But its 1981 release of the IBM PC and the acceptance of PC clones, which were packed with Microsoft software, created a desktop computing market that changed the face of IT and put Microsoft at the centre of power in the industry. Microsoft now faces threats from Linux, Google and Europe's antitrust regulators.

The Y2K 'problem'

The first printed mention of a Y2K Armageddon appeared in IDG's Computerworld in 1984. In 1993, we printed Peter de Jager's estimate that Y2K repairs would cost $US100 billion. As hysteria mounted, cost estimates soared to close to $US1 trillion. On January 2, 2000, the whole thing was dismissed as a bad dream and promptly forgotten. IBM said a typical large company spent as many as 400 man-years on the problem.
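
The flaw itself was mundane: to save then-scarce storage, older systems recorded years as two digits, so date arithmetic broke the moment '00' followed '99'. A minimal sketch of the bug and of the common 'windowing' repair (the variable names and pivot value are illustrative, not drawn from any real system):

    # Two-digit years, as countless mainframe-era systems stored them.
    issued, expires = 99, 0   # 1999 and 2000, squeezed into two digits

    term = expires - issued   # -99, not the intended 1

    # The common 'windowing' repair: pivot two-digit years around a cutoff.
    def expand_year(yy, pivot=70):
        # Treat 70-99 as 19xx and 00-69 as 20xx; the pivot is a guess.
        return 1900 + yy if yy >= pivot else 2000 + yy

    term = expand_year(expires) - expand_year(issued)   # 2000 - 1999 = 1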

The new face of outsourcing

The practice of IT outsourcing stretches back to 1949 and ADP's mission to be the payroll service for the world. In 1962, Ross Perot started EDS as a general-purpose IT outsourcing shop. And when Lou Gerstner took over IBM in 1993, his turnaround strategy was largely based on pushing outsourcing services. But outsourcing became a contentious labour and political issue early this century, when multinational corporations stepped up sending IT work offshore during an economic downturn. India's offshoring revenues in fiscal 2005 skyrocketed 34.5 per cent to $US17.2 billion, with more than a million Indian IT workers serving overseas customers.

The rise of PCs

Personal computing is just one dimension of the epochal movement of computing away from centralised mainframes to client/server computing, multi-tier distributed computing, grids and more. But unlike the emergence of the minicomputer and the server, the rise of the PC had special meaning for IT managers: it meant they were no longer in control. That Lotus 1-2-3 spreadsheet user was programming, whether the IT shop liked it or not.


Open-source in data centres

The emergence of open-source software predates Linux. Sendmail, originally written in 1983 for Unix, was the first open-source program to be widely adopted by IT departments. Today, it transfers about 70 per cent of the Internet's email. By 1996, the open-source Apache project had become the most popular Web server software on the Internet; by last month, Apache was running more than 80 million sites, for a 62 per cent market share. However, with IBM backing Linux - followed by HP, Oracle and others - it is Linux that has become the face of open source. Today, the operating system's share of the server market is close to 30 per cent.

Security: from nuisance to all-out assault

In 1988, Robert Morris' worm crippled 6,000 machines on the Internet. Today's Web-tethered companies have built defences that would have stopped the primitive Morris worm in its tracks.

But incursions now come from organised criminals using sophisticated tools to steal information for blackmail, corporate espionage or identity-theft schemes. Last year, the FBI reported that 95 per cent of companies it surveyed had seen network perimeters battered by online criminals.

Client/server computing

One of the biggest IT topics of the 1980s and 1990s has followed a familiar path for technology: from hype to disillusionment to maturity and decline. Early client/server rollouts were costly and unreliable. The mainframe wasn't dead, after all. But standards and tools evolved, as did skills, and companies began to reap the advantages of client/server - scalability, flexibility and ease of application development. More recently, developers have discovered that still more tiers can bring better performance, flexibility and scalability. Applications can be broken into presentation, business logic, data access and data storage layers, each residing where it works best. Stir in Web-based clients and Web services, and those advantages are magnified. Apparently, client/server was just a stepping stone.
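
As a rough sketch of that layering - with class names and data that are purely illustrative, not from any particular framework - each tier knows only about the tier beneath it, so any one of them can be replaced or relocated independently:

    # Data tier: the only layer that knows where the data lives.
    class OrderStore:
        def __init__(self):
            self._orders = {1: {"qty": 3, "unit_price": 9.50}}

        def get(self, order_id):
            return self._orders[order_id]

    # Business-logic tier: pricing rules only; no storage or display concerns.
    class OrderService:
        def __init__(self, store):
            self._store = store

        def total(self, order_id):
            order = self._store.get(order_id)
            return order["qty"] * order["unit_price"]

    # Presentation tier: formatting for one kind of client; a Web front
    # end could replace this function without touching the tiers below.
    def render_total(service, order_id):
        return f"Order {order_id} total: ${service.total(order_id):.2f}"

    print(render_total(OrderService(OrderStore()), 1))   # Order 1 total: $28.50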

Elusive software quality

Since the 1980s, a profusion of development methodologies has gained and lost favour, with mixed results. In 2004, the Standish Group estimated that 18 per cent of IT development projects were cancelled, while 'challenged' projects - those that failed in part - held steady at 53 per cent over the decade. Even the vaunted peer-review quality-control system of open-source development can produce bad code: research funded by the US Department of Homeland Security found that Linux 2.6 had more than 68,000 lines of flawed code.

The geeks go to business school

First you were manager of data processing, then director of MIS, then CIO. Over the years, the focus of IT managers shifted from running payroll systems to automating management information to optimising business processes. As the CIO emerged in the 1980s, IT managers were advised to align IT with the business, think strategically and get a seat at the business (or boardroom) table. Many CIOs have done that. But some pundits predict IT managers as we know them today will eventually disappear, subsumed into the ranks of general business management.