In his famous paper published in April 1965 in the journal Electronics, Gordon Moore wrote: "Integrated circuits will lead to such wonders as home computers -- or at least terminals connected to a central computer -- automatic controls for automobiles, and personal portable communications equipment." Analyzing the future of the industry, he predicted that "reduced cost is one of the big attractions of integrated electronics, and the cost advantage continues to increase as the technology evolves toward the production of larger and larger circuit functions on a single semiconductor substrate. For simple circuits, the cost per component is nearly inversely proportional to the number of components." This became known as Moore's Law. Forty-two years later, it is still valid. But will it still hold, say, 10 years from now? Justin Rattner, Intel's chief technology officer, answers that question in this interview.
Stories by Peter Moon
Steve Wozniak is perhaps not as well known as his Apple cofounder Steve Jobs, but "Woz" invented the Apple I in 1976 and the Apple II in 1977, the latter one of the best-selling PCs of its time. In this interview, Wozniak, who turns 57 on Aug. 11, talks about how he met Jobs, his most cherished inventions and why he believes thinking robots and artificial intelligence will never happen.
Vernor Vinge, 62, is a pioneer in artificial intelligence who, in a recent interview, discussed the risks and opportunities that an electronic superintelligence could present to mankind.