A high-tech pioneer reflects on the digital revolution
- 21 November, 2011 22:09
The man who led the design and development of the first microprocessor 40 years ago said that, at the time, he didn't foresee the extent of the digital revolution he was helping to create.
"I've been surprised to see the impact the microprocessor has had on society, particularly on work," said its designer, Federico Faggin.
Today, Federico Faggin envisions a future where quantum and cognitive computing are widely used, but he said there's no way to foresee the amazing changes to the way we work and live that technological evolution will bring in another 40 years.
"At the time, I clearly understood that it was a revolutionary idea," said Faggin, who designed the microprocessor and led the first microprocessor development project at Intel in 1970. "However, I did not imagine how many new applications it would open up.... I've been surprised to see the impact the microprocessor has had on society, particularly on work. When you walk through an office, you'll see a computer on every desk. You have a very powerful computer literally at your fingertips with your cellphone. We have been affected in ways I certainly did not imagine."
The Intel 4004
Born in Italy in 1941, Faggin (pronounced Fa-jeen) worked at Fairchild Semiconductor in the late 1960s, leading the invention of silicon gate technology and designing the world's first commercial integrated circuit to use that technology. In 1970, he went to work at Intel where he led the work on the Intel 4004, the first commercially available microprocessor, which helped to kick-start the digital revolution. The 40th anniversary of the release of the Intel 4004 was last week.
In 1996, Faggin was inducted into the National Inventors Hall of Fame. In 1997 he was awarded the Kyoto Prize, and in 2009 he received the National Medal of Technology and Innovation from President Barack Obama.
Today, he is chairman emeritus of Synaptics Inc., a developer of user interface products that he co-founded in 1986. He also studies consciousness, which he calls "the last frontier" in understanding information processing.
But even with all of his awards and achievements, Faggin says he's still amazed by the evolution that the microprocessor began.
"I imagined that people would have a computer to do computation at their desks, particularly engineers so they could do their work more effectively than on a slide rule," he said in an interview with Computerworld. "But a personal computer which over the years does more than computation and word processing? A personal computer where you have a multimedia type of environment and a window to connect to the Web? I did not imagine that."
"The way it has transformed our lives was just unthinkable 40 years ago," he added. "What does it mean when you have 2 billion or 3 billion people interconnected? What does it create? That I certainly could not imagine."
However, Faggin said that possibly the most important development to stem from the creation of the microprocessor is the cellphone.
"The personal computer for sure, but the cellular telephone perhaps even more importantly," he said. "There's an incredible penetration of the market. People who don't even know how to read and write know how to use a cellphone. You need a high level of intelligence inside the telephone to switch frequencies and allow mass utilization of the frequencies. You could not have a cellphone without a computer inside."
Faggin is also intrigued by e-books; he said the technology will revolutionize how we learn in the future.
"The e-book, which we are just beginning to see the early stages of, will bring a fundamental transformation of society," he noted. "Ten or 20 years from now, books will be interactive, multimedia types of things. In school, students will have a much higher level of learning through a well-designed e-book than was ever thought possible."
Do you like the look and feel of a traditional paper book? That's too bad, because Faggin says that in 30 years traditional books will be completely forgotten.
"For older people like me who like to leaf through a book and like the smell of the paper, it will be a thing of the past," he added. "My grandchildren will simply never consider a regular book anything more than something that was part of the past. People in 30 years will look at a regular book as nothing more than a curiosity."
So what does Faggin think will be the next big milestones in computing? It's hard to predict the way technology will progress, he said, but noted that he expects to see the development of both cognitive and quantum computing. And those technologies should bring dramatic changes to research, mathematics and science, he added.
"The real revolution will be to figure out a way to create cognitive computing -- computers that operate using a way of information processing that is similar to the brain," he said. "We are just in the very early stages of that so it's hard to predict when that will happen."
He said the next big stage of computing will be the development of quantum computing, which would have computers use the quantum physics properties of subatomic particles to do problem-solving now performed with transistors on a chip. "The potential for quantum computers would be that they are able to solve something that we think is unsolvable," he said.
Despite his many years in the industry, Faggin said he remains excited about the technological changes that will be coming. "In 40 years, we'll be so far away from where we are today that it's actually unthinkable," Faggin said. "In 40 years ... conventional computers that will be about 100 million times more powerful and more complex than they are today.... We just can't see how computers will change our lives."
Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter at @sgaudin, or subscribe to Sharon's RSS feed. Her email address is firstname.lastname@example.org.
Read more about IT leadership in Computerworld's IT Leadership Topic Center.