
Wednesday, March 2, 2016

Researchers have created a model biological supercomputer that is both sustainable and highly energy efficient



The model bio-supercomputer is powered by adenosine triphosphate (ATP), the substance that provides energy to all of the cells in the human body. Using parallel networks, it processes information quickly and accurately, much as electronic supercomputers do.

However, the bio-supercomputer developed by the project team is far smaller and more energy efficient than the current generation of electronic supercomputers: the whole device is only about the size of a book.

The model bio-supercomputer was created with a combination of geometrical modelling and engineering expertise on the nano-scale. Importantly, it is the first step in showing that a biological supercomputer could realistically work in practice.

Small, portable and energy efficient

The circuit created by the researchers is around 1.5 cm square. Instead of electrons being propelled by an electrical charge, as in a traditional microchip, short strings of proteins (called ‘biological agents’ by the project team) travel around the circuit in a controlled way. These movements are powered by ATP, the biochemical that transfers energy within cells.

Traditional supercomputers use a large amount of electricity and heat up to such high temperatures that they must be actively cooled to function effectively. The power and cooling demands are so great that many supercomputers require their own dedicated power plant.

In contrast, due to being run by biological agents, the bio-supercomputer hardly heats up at all and is consequently much more sustainable and cost-effective. As the technology is developed further over the coming years and possible routes to larger-scale commercialisation are considered, this could become a major selling-point.

Calculating answers to major societal issues

Although the model bio-supercomputer has successfully and efficiently tackled a complex mathematical problem by using parallel computing in the same fashion as traditional supercomputers, the project team recognises that there is still a long way to go between the model and the development of a full-scale functional bio-supercomputer.

It is hoped that an eventual shift to bio-supercomputers will address a growing problem: traditional supercomputers are increasingly unable to quickly calculate answers to some of society’s most pressing issues, such as developing new drugs and verifying that engineering systems work as intended. For these problems, computers simply have to work through every possible guess before reaching the correct answer, so even a modest increase in problem size means the computer can no longer solve it quickly enough to be useful.
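
The "complex mathematical problem" mentioned above is of this combinatorial kind. As a minimal sketch, assuming a subset-sum-style search as a representative example of the class (the article itself does not name the problem), here is what exhaustive checking looks like, and why it scales so badly:

```python
from itertools import combinations

def subset_sum_bruteforce(numbers, target):
    """Exhaustive search: tests all 2**n subsets of n numbers."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return combo  # first subset whose sum hits the target
    return None

print(subset_sum_bruteforce([2, 5, 9], 16))  # -> (2, 5, 9)

# The search space doubles with every number added, which is why a modest
# increase in problem size overwhelms a sequential machine:
for n in (10, 20, 30, 40):
    print(f"{n} numbers -> {2**n:,} subsets to check")
```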

Next steps: From science fiction to science

The project team has already begun to explore further avenues for pushing the research forward, and hopes that other scientists will be encouraged to construct new models using alternative biological materials.

The eventual goal would be to perfect the design for a new generation of smaller, more portable and more energy efficient bio-supercomputers that can fully replace traditional supercomputers.

Although the research team believes that it will still take some time for this to become a reality, a potential mid-term solution would be to produce a hybrid design, mixing traditional and biological technologies.
 

ABACUS Project

Wednesday, January 20, 2016

Quantum computing is coming – are you prepared for it?



Quantum computing will change lives, society and the economy, and a working system is expected to be developed by 2020, according to a leading figure in the world of quantum computing, who will speak tomorrow [21 January 2016] at the World Economic Forum (WEF) in Davos, Switzerland.

Professor Jeremy O’Brien, Director of the Centre for Quantum Photonics at the University of Bristol and Visiting Fellow at Stanford University, is part of a European Research Council (ERC) Ideas Lab delegation invited to address the world's industrial and political leaders, including Prime Minister David Cameron, at the annual meeting. The session will discuss the future of computing and how new fields of computer science are paving the way for the next digital revolution.

Quantum computing has the capability to unlock answers to some of humanity’s most pressing questions that are unsolvable with current computing technologies. In 2014, the UK government invested over £270 million in the development of quantum technologies, aiming to make the UK the epicentre of a technology revolution, and Professor O’Brien has been leading the development of quantum computing using light in its quantum state – the photon – as the key ingredient.

Professor O’Brien said: “In less than ten years quantum computers will begin to outperform everyday computers, leading to breakthroughs in artificial intelligence, the discovery of new pharmaceuticals and beyond.

“The very fast computing power given by quantum computers has the potential to disrupt traditional businesses and challenge our cyber-security. Businesses need to be ready for a quantum future because it’s coming.”
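
Why quantum machines pressure cyber-security is standard textbook territory rather than anything specific to the talk: Grover's quantum search algorithm, for instance, finds a key among N possibilities in roughly √N steps, effectively halving the strength of symmetric keys. A quick illustration of the scaling:

```python
import math

def grover_iterations(key_bits: int) -> float:
    """Approximate Grover iterations to search a 2**key_bits keyspace:
    about (pi/4) * sqrt(N) for N candidates (a textbook result)."""
    return math.pi / 4 * math.sqrt(2 ** key_bits)

for bits in (64, 128, 256):
    print(f"{bits}-bit key: classical ~{2**(bits - 1):.2e} trials on average, "
          f"Grover ~{grover_iterations(bits):.2e} iterations "
          f"(~{bits // 2} effective bits of security)")
```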

In his talk, Professor O’Brien will outline the current status of quantum computing and its potential applications, and he will reveal his architectural blueprint for a manufacturable photonic quantum computer, showing all the components and a roadmap toward building a practical machine.

Quantum technologies offer ultra-secure communications, sensors of unprecedented precision and computers that are exponentially more powerful than any supercomputer for a given task. These technologies are destined to fundamentally change our lives and the first commercially available quantum devices are only now beginning to emerge.

As the holder of a prestigious Royal Academy of Engineering Chair in Quantum Engineering and an EPSRC RISE leader, Professor O’Brien has a ten-year vision to engineer new quantum technologies that will inevitably disrupt today’s ICT models, creating new businesses and valuable new markets.

The World Economic Forum (WEF) Annual Meeting of business and political leaders will take place from 20-23 January 2016 in Davos, Switzerland.

Monday, June 8, 2015

Just add water: Stanford engineers develop a computer that operates on water droplets




Manu Prakash, an assistant professor of bioengineering at Stanford, and his students have developed a synchronous computer that operates using the unique physics of moving water droplets. Their goal is to design a new class of computers that can precisely control and manipulate physical matter.

Computers and water typically don't mix, but in Manu Prakash's lab, the two are one and the same. Prakash, an assistant professor of bioengineering at Stanford, and his students have built a synchronous computer that operates using the unique physics of moving water droplets.

The computer is nearly a decade in the making, incubated from an idea that struck Prakash when he was a graduate student. The work combines his expertise in manipulating droplet fluid dynamics with a fundamental element of computer science – an operating clock.

"In this work, we finally demonstrate a synchronous, universal droplet logic and control," Prakash said.

Because of its universal nature, the droplet computer can theoretically perform any operation that a conventional electronic computer can, although at significantly slower rates. Prakash and his colleagues, however, have a more ambitious application in mind.

"We already have digital computers to process information. Our goal is not to compete with electronic computers or to operate word processors on this," Prakash said. "Our goal is to build a completely new class of computers that can precisely control and manipulate physical matter. Imagine if when you run a set of computations that not only information is processed but physical matter is algorithmically manipulated as well. We have just made this possible at the mesoscale."

The ability to precisely control droplets using fluidic computation could have a number of applications in high-throughput biology and chemistry, and possibly new applications in scalable digital manufacturing.

The results are published in the current edition of Nature Physics.

The crucial clock
For nearly a decade, since he was in graduate school, an idea has been nagging at Prakash: What if he could use little droplets as bits of information and use the precise movement of those drops to process both information and physical materials simultaneously? Eventually, Prakash decided to build a rotating magnetic field that could act as a clock to synchronize all the droplets. The idea showed promise, and in the early stages of the project Prakash recruited a graduate student, Georgios "Yorgos" Katsikis, who is the first author on the paper.

Computer clocks are responsible for nearly every modern convenience. Smartphones, DVRs, airplanes, the Internet – without a clock, none of these could operate without frequent and serious complications. Nearly every computer program requires several simultaneous operations, each conducted in a perfect step-by-step manner. A clock makes sure that these operations start and stop at the same times, thus ensuring that the information synchronizes.

The results are dire if a clock isn't present. It's like soldiers marching in formation: If one person falls dramatically out of time, it won't be long before the whole group falls apart. The same is true if multiple simultaneous computer operations run without a clock to synchronize them, Prakash explained.

"The reason computers work so precisely is that every operation happens synchronously; it's what made digital logic so powerful in the first place," Prakash said.

A magnetic clock
Developing a clock for a fluid-based computer required some creative thinking. It needed to be easy to manipulate, and also able to influence multiple droplets at a time. The system needed to be scalable so that in the future, a large number of droplets could communicate amongst each other without skipping a beat. Prakash realized that a rotating magnetic field might do the trick.

Katsikis and Prakash built arrays of tiny iron bars on glass slides that look something like a Pac-Man maze. They laid a blank glass slide on top and sandwiched a layer of oil in between. Then they carefully injected into the mix individual water droplets that had been infused with tiny magnetic nanoparticles.

Next, they turned on the magnetic field. Every time the field flips, the polarity of the bars reverses, drawing the magnetized droplets in a new, predetermined direction, like slot cars on a track. Every rotation of the field counts as one clock cycle, like a second hand making a full circle on a clock face, and every drop marches exactly one step forward with each cycle.

A camera records the interactions between individual droplets, allowing observation of computation as it occurs in real time. The presence or absence of a droplet represents the 1s and 0s of binary code, and the clock ensures that all the droplets move in perfect synchrony, and thus the system can run virtually forever without any errors.
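
A toy software model makes the scheme concrete (an illustrative sketch, not the team's published gate geometry): droplet presence on a track is a bit, every field rotation is one clock tick, and, roughly as in the paper's logic, two droplets arriving together at a junction interact and exit on separate paths, giving AND and OR outputs at once:

```python
def and_or_junction(a: int, b: int) -> tuple[int, int]:
    """Toy droplet junction. Two droplets arriving on the same clock cycle
    interact and are diverted onto separate tracks, so one exits on the AND
    output and one on the OR output; a lone droplet exits on the OR output
    only. Droplet present = 1, absent = 0."""
    return a & b, a | b  # (AND output, OR output)

# The rotating field is the clock: one rotation = one cycle, and every
# droplet on the chip advances exactly one step per cycle, in lockstep.
for cycle, (a, b) in enumerate([(0, 0), (0, 1), (1, 0), (1, 1)]):
    and_out, or_out = and_or_junction(a, b)
    print(f"cycle {cycle}: inputs=({a},{b}) -> AND={and_out} OR={or_out}")
```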

"Following these rules, we've demonstrated that we can make all the universal logic gates used in electronics, simply by changing the layout of the bars on the chip," said Katsikis. "The actual design space in our platform is incredibly rich. Give us any Boolean logic circuit in the world, and we can build it with these little magnetic droplets moving around."

The current paper describes the fundamental operating regime of the system and demonstrates building blocks for synchronous logic gates, feedback and cascadability – hallmarks of scalable computation. A simple state machine including 1-bit memory storage (known as a "flip-flop") is also demonstrated using these basic building blocks.
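
The 1-bit memory can be mirrored the same way as a clocked state machine (a logical analogue only; in the hardware the stored bit is a droplet held in a physical trap on the chip):

```python
class DropletFlipFlop:
    """Logical analogue of the droplet flip-flop: a trapped droplet stores
    a 1; set/reset droplets arriving on control tracks update the state,
    one update per clock cycle (per field rotation)."""

    def __init__(self) -> None:
        self.stored = 0  # trap starts empty

    def step(self, set_bit: int, reset_bit: int) -> int:
        if reset_bit:       # a reset droplet frees the trapped droplet
            self.stored = 0
        elif set_bit:       # an incoming droplet is captured by the trap
            self.stored = 1
        return self.stored  # state persists between cycles

ff = DropletFlipFlop()
for cycle, (s, r) in enumerate([(1, 0), (0, 0), (0, 1), (0, 0)]):
    print(f"cycle {cycle}: set={s} reset={r} -> stored={ff.step(s, r)}")
```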

A new way to manipulate matter
The current chips are about half the size of a postage stamp, and the droplets are smaller than poppy seeds, but Katsikis said that the physics of the system suggests it can be made even smaller. Combined with the fact that the magnetic field can control millions of droplets simultaneously, this makes the system exceptionally scalable.

"We can keep making it smaller and smaller so that it can do more operations per time, so that it can work with smaller droplet sizes and do more number of operations on a chip," said graduate student and co-author Jim Cybulski. "That lends itself very well to a variety of applications."

Prakash said the most immediate application might involve turning the computer into a high-throughput chemistry and biology laboratory. Instead of running reactions in bulk test tubes, each droplet can carry some chemicals and become its own test tube, and the droplet computer offers unprecedented control over these interactions.

From the perspective of basic science, part of why the work is so exciting, Prakash said, is that it opens up a new way of thinking about computation in the physical world. Although the physics of computation has previously been applied to understand the limits of computation, the physical aspects of bits of information have never been exploited as a new way to manipulate matter at the mesoscale (10 microns to 1 millimeter).

Because the system is extremely robust and the team has uncovered universal design rules, Prakash plans to make a design tool for these droplet circuits available to the public. Any group of people can now cobble together the basic logic blocks and make any complex droplet circuit they desire.

"We're very interested in engaging anybody and everybody who wants to play, to enable everyone to design new circuits based on building blocks we describe in this paper or discover new blocks. Right now, anyone can put these circuits together to form a complex droplet processor with no external control – something that was a very difficult challenge previously," Prakash said.

"If you look back at big advances in society, computation takes a special place. We are trying to bring the same kind of exponential scale up because of computation we saw in the digital world into the physical world." 

Source: http://www.nanotechnologyworld.org/#!Just-add-water-Stanford-engineers-develop-a-computer-that-operates-on-water-droplets/c89r/557640b30cf2312d79770768 

Monday, June 2, 2014

Rensselaer Researchers Predict the Electrical Response of Metals to Extreme Pressures


Findings Published in the Proceedings of the National Academy of Sciences Could Have Applications in Computer Chip Design

Research published today in the Proceedings of the National Academy of Sciences makes it possible to predict how subjecting metals to severe pressure can lower their electrical resistance, a finding with potential applications in computer chips and other technologies that depend on specific electrical resistance.

The semiconductor industry has long manipulated materials like silicon through the use of pressure, a strategy known as “strain engineering,” to improve the performance of transistors. But as the speed of transistors has increased, the limited speed of interconnects – the metal wiring between transistors – has become a barrier to increased computer chip speed. The published research paper, “Pressure Enabled Phonon Engineering in Metals,” opens the door to a new variant of strain engineering that can be applied to the metal interconnects and other materials used to conduct or insulate electricity.

“We looked at a fundamental physical property, the resistivity of a metal, and show that if you pressurize these metals, resistivity decreases. And not only that, we show that the decrease is specific to different materials – aluminum will show one decrease, but copper shows another decrease,” said Nicholas Lanzillo, a doctoral candidate at Rensselaer Polytechnic Institute and lead author on the study.

“This paper explains why different materials see different decreases in these fundamental properties under pressure.”

The research involved theoretical predictions, use of a supercomputer, and experimentation with equipment capable of exerting pressures up to 40,000 atmospheres (nearly 600,000 pounds per square inch). It was made possible through a collaboration between three Rensselaer professors – Saroj Nayak, a professor of physics, applied physics, and astronomy; Morris Washington, associate director of the Center for Materials, Devices, and Integrated Systems and professor of practice of physics, applied physics, and astronomy; and E. Bruce Watson, Institute Professor of Science, and professor of earth and environmental sciences and of materials science and engineering – with a diverse mix of disciplinary backgrounds and skill sets. Jay Thomas, a senior research scientist in Watson’s lab, was primarily responsible for designing the complex experiments detailed in the paper.

When an electrical current is applied to metal, electrons travel through a lattice structure formed by the individual metal atoms, carrying the current along the wiring. But as an electron travels, it is impeded by the normal collective vibration of atoms in the lattice, which is one of the factors that leads to electrical resistance. In physics, this vibration is called a phonon, and the resistance it creates by coupling with electrons is known as electron-phonon coupling, a quantum mechanical effect that amplifies strongly at the atomic scale.

Lanzillo and Nayak, his doctoral adviser, said earlier research using the Center for Computational Innovations – the Rensselaer supercomputer – showed that electron-phonon coupling varies depending on the scale of the wiring: nanoscale wire typically has higher resistance than ordinary-sized, or “bulk,” wiring.

“Our goal was to understand what limits the resistivity, what accounts for the different resistance at the atomic scale,” said Nayak. “Our earlier findings showed that sometimes the resistance of the same metal in bulk and at the atomic scale could change by a factor of 10. That’s a big number in terms of resistivity.”

The researchers wanted to conduct experiments to confirm their findings, but doing so would have required making atomic-scale wires and measuring the electron-phonon coupling as a current passed through them – both difficult tasks. Then they saw an alternative, based on the observation that atoms are closer together in an atomic-scale lattice than in a bulk lattice.

“We theorized that if we compress the bulk wire, we might be able to create a condition where the atoms are closer to each other, to mimic the conditions at the atomic scale,” said Nayak. They approached Watson and Washington to execute an experiment to test their finding.
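
A back-of-the-envelope sketch of the effect being described (the pressure coefficients below are hypothetical placeholders, not the paper's measured values; the ambient resistivities are standard handbook figures):

```python
def resistivity_under_pressure(rho0: float, k: float, p_atm: float) -> float:
    """Toy linear model rho(P) = rho0 * (1 - k * P): compression stiffens
    the lattice, weakens electron-phonon coupling, and lowers resistivity."""
    return rho0 * (1.0 - k * p_atm)

# rho0: room-temperature resistivity in ohm-m (handbook values);
# k: HYPOTHETICAL material-specific pressure coefficient per atmosphere --
# the point is only that each metal has its own value, as the study found.
metals = {"aluminum": (2.65e-8, 2.0e-6), "copper": (1.68e-8, 1.0e-6)}

for name, (rho0, k) in metals.items():
    rho = resistivity_under_pressure(rho0, k, 20_000)  # 20,000 atm, as in the experiments
    print(f"{name}: {rho0:.2e} -> {rho:.2e} ohm-m ({100 * (1 - rho / rho0):.0f}% lower)")
```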

Washington and Nayak have long collaborated through the New York State Interconnect Focus Center at Rensselaer, which researches new material systems for the next generation of interconnects in semiconductor integrated circuits, with a strong interest in interconnects at dimensions of less than 20 nanometers. Existing experimental data indicated that the resistivity of copper – the current preferred interconnect material – increases as the wiring size dips below 50 nanometers.

One goal of the center is to suggest materials and structures for integrated circuit interconnects smaller than 20 nanometers, which often involves fabricating and characterizing experimental thin-film structures with the resources of the Rensselaer Micro and Nano Fabrication Clean Room. With this background, Washington was critical to coordinating the experimental research.

To pressurize the metals, the group turned to Watson, a geochemist who routinely subjects materials to enormous pressures to simulate conditions in the depths of the Earth. Watson had never experimented with the electrical properties of metal wires under pressure – a process that posed a number of technical challenges. Nevertheless, he was intrigued by the theoretical findings, and he and Thomas worked together to design the high-pressure experiments that provided information on the electrical resistivity of aluminum and copper at pressures up to 20,000 atmospheres. Working together, the team was able to demonstrate that the theoretical calculations were correct.

“The experimental results were vital to the study because they confirmed that Saroj and Nick’s quantum mechanical calculations are accurate – their theory of electron-phonon coupling was validated,” said Watson. “And I think we would all argue that theory backed up by experimental confirmation makes the best science.”

The authors said the research offers a new and exciting capability to predict the response of resistivity to pressure through computer simulations. The research demonstrates that changes in resistivity can be achieved in thin-film nanowires by using strain in combination with existing semiconductor wafer fabrication techniques and materials.

Because of this work, the physical properties and performance of a large number of metals can be further explored in a computer, saving the time and expense of wafer fabrication runs.

Lanzillo said the results are a complete package. “We can make this prediction with a computer simulation but it’s much more salient if we can get experimental confirmation. If we can go to a lab and actually take a block of aluminum and a block of copper and pressurize them and measure the resistivity. And that’s what we did. We made the theoretical prediction, and then our friends and colleagues in experiment were able to verify it in the lab and get quantitatively accurate results in both.”

Funding for the research was partially supported by the National Science Foundation Integrative Graduate Education and Research Traineeship (IGERT) Fellowship, Grant No. 0333314, as well as the Interconnect Focus Center (MARCO program) of New York state. Computing resources were provided by the Center for Computational Innovations at Rensselaer, partly funded by the state of New York.

Source: http://news.rpi.edu/content/2014/06/02/rensselaer-researchers-predict-electrical-response-metals-extreme-pressures

Monday, October 28, 2013

UNC neuroscientists discover new 'mini-neural computer' in the brain

This is a dendrite, the branch-like structure of a single neuron in the brain. The bright object from the top is a pipette attached to a dendrite in the brain...

Dendrites, the branch-like projections of neurons, were once thought to be passive wiring in the brain. But now researchers at the University of North Carolina at Chapel Hill have shown that these dendrites do more than relay information from one neuron to the next. They actively process information, multiplying the brain's computing power.

"Suddenly, it's as if the processing power of the brain is much greater than we had originally thought," said Spencer Smith, PhD, an assistant professor in the UNC School of Medicine.

His team's findings, published October 27 in the journal Nature, could change the way scientists think about long-standing scientific models of how neural circuitry functions in the brain, while also helping researchers better understand neurological disorders.

"Imagine you're reverse engineering a piece of alien technology, and what you thought was simple wiring turns out to be transistors that compute information," Smith said. "That's what this finding is like. The implications are exciting to think about."

Axons are where neurons conventionally generate electrical spikes, but many of the same molecules that support axonal spikes are also present in the dendrites. Previous research using dissected brain tissue had demonstrated that dendrites can use those molecules to generate electrical spikes themselves, but it was unclear whether normal brain activity involved those dendritic spikes. For example, could dendritic spikes be involved in how we see?

The answer, Smith's team found, is yes. Dendrites effectively act as mini-neural computers, actively processing neuronal input signals themselves.

Directly demonstrating this required a series of intricate experiments that took years and spanned two continents, beginning in senior author Michael Hausser's lab at University College London and finishing after Smith and Ikuko Smith, PhD, DVM, set up their own lab at the University of North Carolina. They used patch-clamp electrophysiology to attach a microscopic glass pipette electrode, filled with a physiological solution, to a neuronal dendrite in the brain of a mouse. The idea was to directly "listen" in on the electrical signaling process.

"Attaching the pipette to a dendrite is tremendously technically challenging," Smith said. "You can't approach the dendrite from any direction. And you can't see the dendrite. So you have to do this blind. It's like fishing if all you can see is the electrical trace of a fish." And you can't use bait. "You just go for it and see if you can hit a dendrite," he said. "Most of the time you can't."

But Smith built his own two-photon microscope system to make things easier.

Once the pipette was attached to a dendrite, Smith's team took electrical recordings from individual dendrites within the brains of anesthetized and awake mice. As the mice viewed visual stimuli on a computer screen, the researchers saw an unusual pattern of electrical signals – bursts of spikes – in the dendrite.

Smith's team then found that the dendritic spikes occurred selectively, depending on the visual stimulus, indicating that the dendrites processed information about what the animal was seeing.

To provide visual evidence of their finding, Smith's team filled neurons with calcium dye, which provided an optical readout of spiking. This revealed that dendrites fired spikes while other parts of the neuron did not, meaning that the spikes were the result of local processing within the dendrites.

Study co-author Tiago Branco, PhD, created a biophysical, mathematical model of neurons and found that known mechanisms could support the dendritic spiking recorded electrically, further validating the interpretation of the data.

"All the data pointed to the same conclusion," Smith said. "The dendrites are not passive integrators of sensory-driven input; they seem to be a computational unit as well."
His team plans to explore what this newly discovered dendritic role may play in brain circuitry and particularly in conditions like Timothy syndrome, in which the integration of dendritic signals may go awry.

Source: http://news.unchealthcare.org/news/2013/october/unc-neuroscientists-discover-new-2018mini-neural-computer2019-in-the-brain

Wednesday, September 25, 2013

The First Carbon Nanotube Computer

A carbon nanotube computer processor is comparable to a chip from the early 1970s, and may be the first step beyond silicon electronics.

For the first time, researchers have built a computer whose central processor is based entirely on carbon nanotubes, an incredibly tiny form of carbon with remarkable material and electronic properties. The computer is slow and simple, but its creators, a group of Stanford University engineers, say it shows that carbon nanotube electronics are a viable potential replacement for silicon when it reaches its limits in ever-smaller electronic circuits.

The carbon nanotube processor is comparable in capabilities to the Intel 4004, that company’s first microprocessor, which was released in 1971, says Subhasish Mitra, an electrical engineer at Stanford and one of the project’s co-leaders. The computer, described today in the journal Nature, runs a simple instruction set called MIPS. It can switch between multiple tasks (counting and sorting numbers) and keep track of them, and it can fetch data from and send it back to an external memory.

The nanotube processor is made up of 142 transistors, each of which contains carbon nanotubes that are about 10 to 200 nanometers long. The Stanford group says it has made six versions of carbon nanotube computers, including one that can be connected to external hardware—a numerical keypad that can be used to input numbers for addition.

Aaron Franklin, a researcher at the IBM Watson Research Center in Yorktown Heights, New York, says the comparison with the 4004 and other early silicon processors is apt. “This is a terrific demonstration for people in the electronics community who have doubted carbon nanotubes,” he says.

Franklin’s group has demonstrated that individual carbon nanotube transistors—smaller than 10 nanometers—are faster and more energy efficient than those made of any other material, including silicon. Theoretical work has also suggested that a carbon nanotube computer would be an order of magnitude more energy efficient than the best silicon computers. And the nanomaterial’s ability to dissipate heat suggests that carbon nanotube computers might run blisteringly fast without heating up—a problem that sets speed limits on the silicon processors in today’s computers.

Still, some people doubt that carbon nanotubes will replace silicon. Working with carbon nanotubes is a big challenge. They are typically grown in a way that leaves them in a tangled mess, and about a third of the tubes are metallic, rather than semiconducting, which causes short-circuits.

Over the past several years, Mitra has collaborated with Stanford electrical engineer Philip Wong, who has developed ways to sidestep some of the materials challenges that have prevented the creation of complex circuits from carbon nanotubes. Wong developed a method for growing mostly very straight nanotubes on quartz, then transferring them over to a silicon substrate to make the transistors. The Stanford group also covers up the active areas of the transistors with a protective coating, then etches away any exposed nanotubes that have gone astray.

Wong and Mitra also apply a voltage to turn all of the semiconducting nanotubes on a chip to “off.” Then they pulse a large current through the chip; the metallic ones heat up, oxidize, and disintegrate. All of these nanotube-specific fixes—and the rest of the manufacturing process—can be done on the standard equipment that’s used to make today’s silicon chips. In that sense, the process is scalable.

Late last month at Hot Chips, an engineering design conference hosted, coincidentally, at Stanford, the director of the Microsystems Technology Office at DARPA made a stir by discussing the end of silicon electronics. In a keynote, Robert Colwell, former chief architect at Intel, predicted that by as early as 2020 the computing industry will no longer be able to keep making performance and cost improvements by doubling the density of silicon transistors on chips every 18 to 24 months—a trend dubbed Moore’s Law after the Intel cofounder Gordon Moore, who first observed it.


Mitra and Wong hope their computer shows that carbon nanotubes may be a serious answer to the question of what comes next. So far, no emerging technology comes close to touching silicon. Of all the materials and new ideas held up as possible saviors—nanowires, spintronics, graphene, biological computers—no one has made a central processing unit based on any of them, says Mitra. In that context, catching up to silicon’s performance circa 1970, though it leaves a lot of work to be done, is exciting.

Victor Zhirnov, a specialist in nanoelectronics at the Semiconductor Research Corporation in Durham, North Carolina, is more cautiously optimistic. The nanotube processor has 10 million times fewer transistors on it than today’s typical microprocessors, runs much more slowly, and operates at five times the voltage, meaning it uses about 25 times as much power, he notes.
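
The 25× figure follows from the usual rule of thumb that dynamic switching power scales with the square of supply voltage (P ≈ αfCV²), so five times the voltage means roughly 5² = 25 times the power at comparable activity:

```python
def relative_dynamic_power(voltage_ratio: float, freq_ratio: float = 1.0) -> float:
    """Dynamic CMOS power scales as P ~ alpha * f * C * V**2; with activity
    and capacitance held fixed, power goes as frequency times voltage squared."""
    return freq_ratio * voltage_ratio ** 2

print(relative_dynamic_power(5))  # 5x supply voltage -> 25.0x switching power
```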

Some of the nanotube computer’s sluggishness is due to the conditions under which it was built—in an academic lab using what the Stanford group had access to, not an industry-standard factory. The processor is connected to an external hard drive, which serves as the memory, through a large bundle of electrical wires, each of which connects to a large metal pin on top of the nanotube processor. Each of the pins in turn connects to a device on the chip. This messy packaging means the data has to travel longer distances, which cuts into the efficiency of the computer.

With the tools at hand, the Stanford group also can’t make transistors smaller than about one micrometer—compare that with Intel’s announcement earlier this month that its next line of products will be built on 14-nanometer technology. If, however, the group were to go into a state-of-the-art fab, its manufacturing yields would improve enough to be able to make computers with thousands of smaller transistors, and the computer could run faster.

To reach the superb level of performance theoretically offered by nanotubes, researchers will have to learn how to build complex integrated circuits made up of pristine single-nanotube transistors. Franklin says device and materials experts like his group at IBM need to start working in closer collaboration with circuit designers like those at Stanford to make real progress.

“We are well aware that silicon is running out of steam, and within 10 years it’s coming to its end,” says Zhirnov. “If carbon nanotubes are going to become practical, it has to happen quickly.”

Source: