Scaling up by scaling down
Future of IT: Nanotechnology promises a breakthrough in computer chips
Nanotechnology, the control of matter at the atomic or molecular level, offers the potential to revolutionize computing by enabling denser processors and memory chips.
- By John Savage
- Dec 07, 2007
At Brown University, computer science professor John Savage has been working to evaluate nanotechnologies as they relate to computing.
How soon might consumers see the impact of nanotechnologies? Savage told GCN he hopes to see initial products within the next year, and he projects that the technologies will really prove themselves valuable in the next 10 years.
And they might offer one benefit that's in increasing demand: a way to reduce energy consumption.
NEW TECHNOLOGIES NEED applications that allow them to be perfected. Nanotechnology needs applications to drive it forward and build its credibility.
I'm hoping that nanoscale materials, wires and technologies for making switches, will emerge and be adopted by industry beginning in the next year.
When you build a computer processing chip today, the wires are defined by the wavelength of the electromagnetic radiation, the light, used to pattern them.
The highest frequency, which means the smallest wavelength, defines how narrow an opening you can cut into a piece of metal and shine a light through to define a clean line, a wire.
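That resolution limit is commonly approximated by the Rayleigh criterion. A minimal sketch, with illustrative numbers for a 193 nm argon-fluoride scanner (the k1 and numerical-aperture values here are assumptions, not figures from the article):

```python
# Minimum printable feature (critical dimension) under the Rayleigh criterion:
#   CD = k1 * wavelength / NA
# k1 is a process-dependent factor; NA is the lens's numerical aperture.
def critical_dimension(wavelength_nm: float, k1: float, na: float) -> float:
    """Smallest half-pitch a projection system can resolve, in nanometers."""
    return k1 * wavelength_nm / na

# Illustrative values for a 193 nm immersion scanner (assumed, not from the text).
cd = critical_dimension(wavelength_nm=193, k1=0.35, na=1.35)
print(f"~{cd:.0f} nm features")  # roughly 50 nm
```

Shrinking the wavelength or raising the numerical aperture is expensive, which is why wires an order of magnitude narrower than this limit are so attractive.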
And those wires are going to be at least a factor of 10 wider than the nanowires that we can put down. To imagine a factor of 10, just take a ruler and look at one inch versus 10 inches. A factor of 10 is huge.
You can use nanowires to compute and store data, but the goal is to increase the density of the bits that you can put on a chip.
There are people who have shown that with nanotechnology they can put a number of bits on a chip approaching 10^11 or 10^12 per square centimeter.
This is a high density. That's a couple of orders of magnitude higher than what we have today in complementary metal-oxide semiconductor (CMOS) chips.
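The jump from wire width to bit density follows from simple area scaling, which can be checked with arithmetic. A sketch with round, illustrative pitches (both pitch values are assumptions chosen to match the orders of magnitude in the text):

```python
# In a crossbar memory, one bit sits at each wire crossing, so bit density
# scales as the inverse square of the wire pitch: wires a factor of 10
# narrower yield roughly 100x (two orders of magnitude) more bits per area.
def bits_per_cm2(pitch_nm: float) -> float:
    wires_per_cm = 1e7 / pitch_nm   # 1 cm = 1e7 nm
    return wires_per_cm ** 2        # one bit per crossing

cmos = bits_per_cm2(100)   # ~100 nm lithographic pitch (assumed)
nano = bits_per_cm2(10)    # ~10 nm nanowire pitch (assumed)
print(f"{cmos:.0e} vs {nano:.0e} bits/cm^2")  # 1e+10 vs 1e+12
print(f"density gain: {nano / cmos:.0f}x")    # 100x
```

The factor of 10 in wire width thus becomes the "couple of orders of magnitude" in density.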
Secondly, the technology could reduce power consumption, which would be a big win. Some of the technologies that have been proposed are nonvolatile memories with no power consumption when they're sitting idle.
From the user point of view, this would allow you to do an extremely fast power-up of the computer.
To make it practical, you need to be able to use a small number of lithographic wires to control the nanowires. A number of methods have been introduced for just this purpose.
My students and I have analyzed them all.
What is interesting is that every method we know about so far introduces randomness.
Basically, at the nanoscale, you can't position things where you'd like to position them. You have to go through a discovery process and then go through a configuration step so that you can map the external binary addresses to the internal ones.
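The discovery-and-configuration idea can be sketched abstractly. In this hypothetical model, each nanowire ends up with an effectively random activation code at fabrication time; discovery learns which codes are present on a given chip, and configuration builds a table mapping clean external binary addresses onto them (the code representation here is an illustrative assumption, not any specific proposed scheme):

```python
import random

random.seed(1)  # fixed seed so the sketch is repeatable

# "Fabrication": each nanowire gets a random code saying which of the
# lithographic control wires activate it. Duplicates collide and are lost,
# so the set below holds only the distinct, usable codes.
NUM_CONTROL_WIRES = 8
codes = {tuple(random.randint(0, 1) for _ in range(NUM_CONTROL_WIRES))
         for _ in range(40)}

# "Configuration": assign consecutive external addresses to whatever
# internal codes this particular chip turned out to have.
address_map = {ext: code for ext, code in enumerate(sorted(codes))}

print(len(address_map), "usable nanowires discovered")
print("external address 0 maps to internal code", address_map[0])
```

Because the codes differ from chip to chip, the translation table must be rebuilt per chip, which is exactly the post-fabrication step that lithography-era design flows never needed.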
That is an issue with people using lithography because their goal in designing chips is to make sure those chips work coming out of the chute.
The semiconductor industry is reluctant to change. They've got so much invested in photolithography that they make changes slowly.
I'm not particularly optimistic that they will move unless they face a genuine crisis.
Nanotechnology is a domain in which the chemist comes forward.
Photolithography has been the domain of the physicist.
That's a gross exaggeration, but with nanotechnology, you can lay down lots of small wires in parallel on a chip easily using old-fashioned techniques that have been in the lab for 15 to 20 years.
You can work at the nano scale and make working devices using standard equipment. It's not cheap but it's not grossly expensive either.
If you move to nanotechnology you might not have to increase the cost by billions of dollars every time you start a new feature line. So we can expect costs to come down.
I would say that within 10 years we'll know if nanotechnology is going to make it. ■