Happy birthday, x86! An industry standard turns 30
- 05 June, 2008 08:21
Thirty years ago, on June 8, 1978, Intel introduced its first 16-bit microprocessor, the 8086, with a splashy ad heralding "the dawn of a new era." Overblown? Sure, but also prophetic. While the 8086 was slow to take off, its underlying architecture -- later referred to as x86 -- would become one of technology's most impressive success stories.
"X86" refers to the set of machine language instructions that certain microprocessors from Intel and a few other companies execute. It essentially defines the vocabulary and usage rules for the chip. X86 processors -- from the 8086 through the 80186, 80286, 80386, 80486 and various Pentium models, right down to today's multicore chips and processors for mobile applications -- have over time incorporated a growing x86 instruction set, but each has offered backward compatibility with earlier members of the family.
In the three decades since the introduction of the 8086, the x86 family has systematically progressed from desktop PCs to servers to portable computers to supercomputers. Along the way, it has killed or held at bay a host of competing architectures and chip makers. Even some markets that had seemed locked up by competitors, such as Apple's use of Motorola PowerPCs in the Macintosh computer, have yielded to x86 in recent years.
How did Intel's architecture come to dominate so much of the computing world? Let's take a look.
In the beginning
Intel's first microprocessor was the 4-bit 4004, which was made for a Japanese calculator in 1971. That was quickly followed by the 8-bit 8008 in 1972 and the 8-bit 8080 in 1974. The 8080 went into the Altair 8800 PC, which was sold as a mail-order kit. (Bill Gates and Paul Allen founded Microsoft to sell their version of Basic for the Altair 8800.)
Three years later, the 16-bit 8086 made its debut. IBM's selection of the 8088, an 8086 variant, to power its PC in the early '80s gave the x86 architecture tremendous momentum and helped it become an industry standard that persists today.
Patrick Gelsinger, electrical engineer, chip designer and now executive vice president at Intel, says the critical turning point for the PC industry -- the thing that really sent the industry into overdrive -- was the introduction of the 32-bit 80386 in 1985. It was not obvious at the time that the x86 needed to be upgraded from the 16-bit address space of the earlier models, he says. "People said, 'What do you mean 32 bits? That's for minicomputers and mainframes.' They derided us at the time for being extravagant."
At about the same time, Compaq announced a 386-based PC, which loosened IBM's death grip on the personal computer market. The IBM PC at the time ran the 16-bit 80286, which the 386 outperformed by more than a factor of three.
According to Intel, IBM spurned the 386 because there was not yet any 32-bit software to take advantage of it. IBM was also developing a proprietary 16-bit operating system called OS/2.
"IBM owned the architecture from top to bottom. It was their applications, their operating system and their hardware design," says Gelsinger, who was a member of the 386 design team. "When they went to the next generation, they would be the only company able to offer the top-to-bottom solution, with no guarantee of compatibility from one generation to the next."
All that changed with the advent of the 386, Gelsinger says. "We moved from a vertical industry to a horizontal industry, and that really opened up the world."
The 386 was followed by the 486 in 1989. Finding that it couldn't trademark numbers, however, Intel broke from its earlier naming convention in 1993, when it named its fifth-generation processor the Pentium rather than the 586. Numerous generations of chips have carried on the Pentium brand (e.g., Pentium Pro, Pentium II and Pentium D), and Intel has since added the low-end Celeron and the high-end Core 2 brands to its x86 offerings.
Despite the name changes -- not to mention design improvements that led to exponential increases in speed, power and efficiency -- all of these chips are based on the x86 instruction set that began with the 8086 and continues to expand today.
Ingredients in a recipe for success
Why has the x86 been so successful for so long, beating back and in some cases completely vanquishing competing microprocessor architectures? For starters, the x86 came along at just the right time. By 1978, computing had been migrating from huge, expensive mainframes to smaller, cheaper minicomputers for several years. The desktop was the logical next frontier.
Moreover, the x86 demonstrated a property that had been predicted in 1965 by Gordon Moore, who would one day become Intel's chairman and CEO. Moore said, in essence, that microprocessors would double in performance every two years at no increase in cost. His prediction, later dubbed Moore's Law, proved to be correct, and the x86 went on to dominate large swaths of computing, from the data center to the workplaces and homes of end users.
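A back-of-the-envelope sketch can show how well that doubling held up. Assuming the 8086's widely cited count of about 29,000 transistors in 1978 and one doubling every two years, the projection lands within roughly a factor of two of the counts the timeline below cites for the 80386, the Pentium and the Pentium 4:

```python
# Back-of-the-envelope Moore's Law projection: transistor counts
# doubling roughly every two years, starting from the 8086's
# widely cited figure of about 29,000 transistors in 1978.

def projected_transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward, assuming one doubling
    per `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Compare projections against the counts cited in the timeline.
for year, actual in [(1985, 275_000),      # 80386
                     (1993, 3_100_000),    # Pentium
                     (2000, 42_000_000)]:  # Pentium 4
    projected = projected_transistors(29_000, 1978, year)
    print(f"{year}: projected ~{projected:,.0f}, actual {actual:,}")
```

The projections run a little hot, but over two decades they stay on the same curve as the shipping chips, which is the remarkable part of Moore's observation.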
And the 8086 and its successors continued to cement the relationship between two early giants of the desktop computer industry. Bill Gates and Paul Allen had tried but failed to develop their Basic programming language for the wimpy 8008 processor in 1972. But they made it work on the more powerful 8080 at the heart of the Altair microcomputer in 1975.
That marked the beginning of a de facto partnership between Intel and Microsoft that would create a gargantuan base of software that continues to drive the industry today. Of all the factors that have led to the success of the x86 architecture, probably none is so important as that software inventory -- and no example better demonstrates this fact than the RISC processor scare.
The RISC risk
In the late 1980s and early 1990s, a serious threat to the x86 arose in the form of reduced instruction set computing (RISC) processors such as the Sun Sparc, the IBM/Apple/Motorola PowerPC and the MIPS processors. The idea was that a processor could be made to run blindingly fast if it worked on very simple instructions, with one instruction executed each clock cycle, rather than with the elaborate, multicycle instructions used in complex instruction set computing (CISC) processors like the x86.
Pundits, the press and Intel competitors widely predicted the demise of CISC at the time. "It was a difficult time for us," Gelsinger acknowledges. Indeed, Intel rushed to develop its own RISC workstation processor, the i860. But neither the i860 nor any other RISC processor came close to dislodging the x86.
Here's why, according to Gelsinger, who was the lead architect for the 80486 processor: "The day before the 486 was announced [April 10, 1989], there was already billions of dollars of software waiting to run on the chip. Even though the [x86 CISC] architecture was a little bit slower, by the time you could develop software for the RISC machine, we could make the [x86] machine that much faster. We had an overwhelming economic advantage because we had so much of an installed base and so many people developing. The RISC machine could never catch up."
Ironically, the lack of software for RISC machines -- plus big performance gains on the 80486 and Pentium processors -- doomed Intel's i860 along with other RISC processors. Trying to introduce a second major microprocessor architecture was a mistake, Intel would later admit.
But RISC spurred much innovation, says David Patterson, a computer science professor at the University of California, Berkeley, and one of the key RISC innovators in the 1980s. "The [Digital Equipment Corp.] VAX architecture, for example, could not keep up with RISC, and it more or less disappeared. But Intel was able to incorporate the ideas that were becoming popular in RISC while maintaining their old architecture with its large software base. And they did that in part with their superior manufacturing."
The floating-point fiasco
Perhaps as gut-wrenching as the RISC threat was a crisis that began in the summer of 1994, when Intel test engineers discovered a tiny flaw in the floating-point division circuitry of its new Pentium chip. The flaw occurred so rarely and was so minor in its impact that Intel elected to just fix it and put the chip back into production without recalling the flawed chips.
But a few months later, Thomas Nicely, a math professor at Lynchburg College in Virginia, discovered the flaw in his PC. He was unable, Intel was to admit later, to find anyone at Intel who would even listen to his complaint. So he posted his findings on the Internet, and before long, Intel was engulfed in a firestorm of criticism that would ultimately lead to a public relations disaster and a $475 million recall of the chip.
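The flaw struck only certain operand pairs in the chip's division lookup table. One widely reported trigger was the ratio 4195835 / 3145727, where a flawed Pentium reportedly returned about 1.333739 instead of the correct 1.333820, and a simple self-test along the lines Nicely publicized makes the error obvious:

```python
# The Pentium FDIV flaw hit only certain operand pairs. One widely
# reported trigger was 4195835 / 3145727: a flawed chip reportedly
# returned about 1.333739, while the correct quotient is ~1.333820.

x, y = 4195835, 3145727

correct = x / y                  # a correct FPU gives ~1.3338204491
flawed_pentium = 1.333739068902  # the value the flawed chips reportedly produced

print(f"correct ratio:  {correct:.12f}")
print(f"flawed Pentium: {flawed_pentium:.12f}")

# A simple self-test: x - (x / y) * y should come out at (or near) zero.
# On a flawed Pentium, this expression reportedly came out around 256.
residue = x - (x / y) * y
print(f"residue on a correct FPU: {residue}")
```

An error in the fourth decimal place sounds small, but for a chip marketed on its floating-point performance it was impossible to explain away once users could reproduce it on their own desks.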
"It was a painful rite of passage, but we finally learned to behave like a consumer company," recalls Albert Yu, a former Intel senior vice president, in his book, Creating the Digital Future.
Mixing and matching
Another defining moment in x86 history occurred in 1995, says Todd Mowry, a computer science professor at Carnegie Mellon University and an Intel research consultant. That's when Intel introduced the Pentium Pro, a microprocessor with some radical new features, such as the ability to look ahead in a stream of instructions, guess which ones would be needed and then execute them out of order. That kept the processor busy a larger percentage of the time and, combined with a new, extremely fast on-chip cache, offered huge performance gains in some applications.
"The thing that was radically different," Mowry says, "was that they used the benefits of RISC without changing the instruction set. They did that by translating the x86 instructions into micro-operations that are more like RISC instructions. So what you had was a RISC machine inside an x86 machine, and overnight, that eliminated the performance gap."
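The idea behind that translation can be sketched in a few lines. This is a toy model with a made-up encoding, not the Pentium Pro's actual decoder: a complex x86-style instruction that touches memory gets cracked into simple load/operate/store micro-ops, while a register-to-register instruction is already RISC-like and passes through whole:

```python
# Toy sketch (hypothetical encoding, not real Pentium Pro internals):
# a CISC-style instruction with a memory operand is cracked into
# simple RISC-like micro-ops an out-of-order core can schedule.

def decode_to_uops(instruction):
    """Split 'ADD [addr], reg' style instructions into load/op/store
    micro-ops; register-to-register instructions pass through as one."""
    op, dst, src = instruction
    if dst.startswith("["):               # memory destination: read-modify-write
        addr = dst.strip("[]")
        return [("LOAD",  "tmp", addr),   # bring the value on-chip
                (op,      "tmp", src),    # do the arithmetic on registers
                ("STORE", addr,  "tmp")]  # write the result back
    return [instruction]                  # already RISC-like: one micro-op

# A memory-operand ADD becomes three micro-ops...
print(decode_to_uops(("ADD", "[0x1000]", "eax")))
# ...while a register-to-register ADD stays a single micro-op.
print(decode_to_uops(("ADD", "ebx", "eax")))
```

Once everything inside the core is a stream of uniform micro-ops, the scheduling tricks pioneered on RISC machines apply without the programmer-visible instruction set changing at all.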
Mowry says the Pentium Pro resulted from a top-down design process. "They started out with the design of a fast machine and then figured out how to make the x86 run on it," he says.
That approach -- finding good ideas in non-x86 architectures and working backward from them -- was just how it worked, Gelsinger says. "The Pentium was a dramatic architectural leap. We took the best ideas from minis and mainframes and just implemented them better, because we had a superior canvas to paint them on, called silicon."
Unlike a mainframe, which spreads processing components over a wide area inside the box, putting everything on a single, tiny, tightly integrated chip gives microprocessor designers more flexibility and their designs more power, he says. Indeed, over the years, the performance of silicon chips has marched smartly along according to Moore's Law, while systems of interconnected components have not improved as fast.
The competition heats up
Intel has not enjoyed immunity from competition even on its x86 home turf. For example, Taiwan-based VIA Technologies was founded in Silicon Valley in 1987 to sell core logic chip sets, some using x86 technology, for use in motherboards and other electronic components. VIA now makes a wide variety of products and aims its x86 processors at low-power mobile and embedded markets.
Advanced Micro Devices, the world's No. 2 maker of microprocessors, has been a competitive thorn in Intel's side since about 2000. Throughout most of the 1980s and 1990s, AMD was a me-too maker of x86 chips and of little concern to Intel. (It still has only about 15 per cent of the x86-compatible desktop and mobile market, according to Mercury Research.)
But AMD scored a technical and public relations coup in 2000 when it announced x86-64, a 64-bit superset of the x86 instruction set. Because it was a superset, new x86-64 machines could still run existing 32-bit software natively.
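Why a superset preserves old software can be sketched with sets. The instruction names below are made up for illustration; the point is only the containment relation: everything a 32-bit binary uses is also in the 64-bit set, so the old binary still decodes on the new chip:

```python
# A minimal sketch of the superset argument (hypothetical instruction
# names): the 64-bit set contains the whole 32-bit set plus new
# instructions, so existing 32-bit binaries still decode natively.

x86_32 = {"MOV", "ADD", "JMP", "PUSH", "POP"}
x86_64 = x86_32 | {"MOVQ", "PUSH64", "REX_PREFIX"}  # superset: old plus new

def runs_natively(binary_instructions, supported_set):
    """A binary runs natively if the chip decodes every instruction it uses."""
    return set(binary_instructions) <= supported_set

old_32bit_binary = ["MOV", "ADD", "JMP"]
print(runs_natively(old_32bit_binary, x86_64))  # True: old code keeps working
print(runs_natively(["MOVQ"], x86_32))          # False: new code needs the new chip
```

The same containment relation is what let every x86 generation since the 8086 carry its predecessors' software forward.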
At the time, Intel's 64-bit offering was Itanium, an architecture developed with Hewlett-Packard around explicitly parallel instruction computing (EPIC) for big iron, and it was not directly compatible with 32-bit x86 software. Intel responded to the AMD threat with its own 64-bit x86 instruction superset, EM64T, in 2004. AMD, and the press, made much of the fact that the company had beaten Intel to the 64-bit market that mattered most.
"It's an example of where the flexibility of the x86 instruction set was used against Intel," says Patterson. "So even though Intel dominates the market, another company can change directions for the x86."
Going to extremes
Today, Intel's x86 is chipping away at the extremes in computing. On April 28, the company announced it would team with Cray to develop new supercomputers based on its x86 processors. (Cray already uses AMD's x86-based 64-bit Opteron processors.)
And at a Shanghai developer conference on April 2, Intel announced the Atom, the company's smallest x86-based processor. It draws less than 2.5 W of power, compared with about 35 W for a typical laptop processor. The company shipped two new Atom chips for small laptops and desktops just this week.
So can the x86 thrive, or even survive, another 30 years? There are forces in play that will fundamentally transform microprocessor designs, even in the near term. But few are predicting the demise of the venerable x86. Says Carnegie Mellon's Mowry, "It's difficult to see any reason why another instruction set would take over, because there is so much valuable software that runs on [the x86]."
Timeline: A brief history of the x86 microprocessor
Here's a peek at the events and technologies that led to the development of Intel's x86 architecture, plus milestones in its 30-year reign.
1947: The transistor is invented at Bell Labs.
1965: Gordon Moore at Fairchild Semiconductor observes in an article for Electronics magazine that the number of transistors on a semiconductor chip doubles every year. For microprocessors, it will double about every two years for more than three decades.
1968: Moore, Robert Noyce and Andy Grove found Intel to pursue the business of "INTegrated ELectronics."
1969: Intel announces its first product, the world's first metal oxide semiconductor (MOS) static RAM, the 1101. It signals the end of magnetic core memory.
1971: Intel launches the world's first microprocessor, the 4-bit 4004, designed by Federico Faggin.
The 2,000-transistor chip is made for a Japanese calculator, but a farsighted Intel ad calls it "a microprogrammable computer on a chip."
1972: Intel announces the 8-bit 8008 processor. Teenagers Bill Gates and Paul Allen try to develop a programming language for the chip, but it is not powerful enough.
1974: Intel introduces the 8-bit 8080 processor, with 4,500 transistors and 10 times the performance of its predecessor.
1975: The 8080 chip finds its first PC application in the Altair 8800, launching the PC revolution. Gates and Allen succeed in developing the Altair Basic language, which will later become Microsoft Basic, for the 8080.
1976: The x86 architecture suffers a setback when Steve Jobs and Steve Wozniak introduce the Apple II computer using the 8-bit MOS Technology 6502 processor. PC maker Commodore also uses the rival chip.
1978: Intel introduces the 16-bit 8086 microprocessor. It will become an industry standard.
1979: Intel introduces a lower-cost version of the 8086, the 8088, with an 8-bit bus.
1980: Intel introduces the 8087 math co-processor.
1981: IBM picks the Intel 8088 to power its PC. An Intel executive would later call it "the biggest win ever for Intel."
1982: At IBM's insistence, Intel signs Advanced Micro Devices as a second source for 8086 and 8088 microprocessors.
1982: Intel introduces the 16-bit 80286 processor with 134,000 transistors.
1984: IBM develops its second-generation PC, the 80286-based PC-AT. The PC-AT running MS-DOS will become the de facto PC standard for almost 10 years.
1985: Intel exits the dynamic RAM business to focus on microprocessors, and it brings out the 80386 processor, a 32-bit chip with 275,000 transistors and the ability to run multiple programs at once.
1986: Compaq Computer leapfrogs IBM with the introduction of an 80386-based PC.
1987: VIA Technologies is founded to sell x86 core logic chip sets.
1989: The 80486 is launched, with 1.2 million transistors and a built-in math co-processor. Intel predicts the development of multicore processor chips some time after 2000.
Late 1980s: The complex instruction set computing (CISC) architecture of the x86 comes under fire from the rival reduced instruction set computing (RISC) architectures of the Sun Sparc, the IBM/Apple/Motorola PowerPC and the MIPS processors. Intel responds with its own RISC processor, the i860.
1990: Compaq introduces the industry's first PC servers, running the 80486.
1993: The 3.1 million transistor, 66-MHz Pentium processor with superscalar technology is introduced.
1994: AMD and Compaq form an alliance to power Compaq computers with Am486 microprocessors.
1995: The Pentium Pro, a RISC slayer, debuts with radical new features that allow instructions to be anticipated and executed out of order. That, plus an extremely fast on-chip cache and dual independent buses, enable big performance gains in some applications.
1997: Intel and Hewlett-Packard unveil EPIC, the 64-bit architecture behind the future Itanium. Intel also introduces the Pentium with MMX, multimedia extensions for digital signal processing applications, including graphics, audio and voice processing.
1998: Intel introduces the low-end Celeron processor.
1999: VIA acquires Cyrix and Centaur Technology, makers of x86 processors and x87 co-processors.
2000: The Pentium 4 debuts with 42 million transistors.
2003: AMD ships the Opteron, its first processor implementing x86-64, a 64-bit superset of the x86 instruction set.
2004: AMD demonstrates an x86 dual-core processor chip.
2005: Intel ships its first dual-core processor chip.
2005: Apple announces it will transition its Macintosh computers from PowerPCs made by Freescale (formerly Motorola) and IBM to Intel's x86 family of processors.
2005: AMD files antitrust litigation charging that Intel abuses "monopoly" to exclude and limit competition. (The case is still pending in 2008.)
2006: Dell announces it will offer AMD processor-based systems.