It's had an unprecedented run. It ushered in the Information Age and remains at the center of an unparalleled economic boom. But microcircuitry is running a race it cannot win. Moore's Law, which brashly and accurately predicted the doubling of computer capabilities every year, is bumping up against the laws of physics, and physics is unlikely to yield. Ironically, the very thing that sustained Moore's Law for so long is now prescribing the limits of silicon-based semiconductor technology. The culprit is miniaturization.
Chip technology has always relied on miniaturization for increases in speed and capacity. But we are rapidly approaching an impasse. While chip developers strain to squeeze the last ounce of blood from the miniaturization turnip, fiscal and physical realities are converging to limit what is practical and possible.
The most daunting problems lie in the murky borderlands between Newtonian physics and quantum mechanics, realms governed by different and often contradictory rules. As chip designers pack semiconductors with components measured in nanometers (billionths of a meter), they begin to straddle the worlds of conventional and quantum physics, in which the elements of matter behave in fanciful and counterintuitive ways. When, for example, transistors are shrunk to 50 nanometers or less, electrons begin to wander and behave erratically. At such diminutive extremes, semiconductor gates measure a scant two nanometers and can no longer predictably block electron flow. In the quantum world, electrons are capable of tunneling through such bantam barriers and materializing on the other side. However well quantum theory describes this behavior, the motility of electrons renders the device useless.
While herding electrons is among the principal challenges facing semiconductor developers, the exponential cost of manufacturing may soon override all other concerns. Consider that back in 1965, when Gordon Moore postulated the number of devices on a microchip would roughly double each year, a state-of-the-art chip held but 64 transistors. Today, Intel's Pentium III processor holds an astonishing 28 million. Not surprisingly, the smaller and more complex the chip, the greater the cost and difficulty of producing it. Manufacturing facilities that once cost several million dollars now run into the billions.
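A back-of-the-envelope check on the two transistor counts quoted above shows how the doubling pace actually played out. The sketch below uses only the article's own figures; the arithmetic is illustrative, not a precise industry history:

```python
import math

# Figures quoted above: ~64 transistors in 1965, ~28 million in the
# Pentium III circa 2000.
transistors_1965 = 64
transistors_2000 = 28_000_000
years = 2000 - 1965

doublings = math.log2(transistors_2000 / transistors_1965)
months_per_doubling = years * 12 / doublings

print(f"{doublings:.1f} doublings over {years} years")
print(f"about {months_per_doubling:.0f} months per doubling")
```

Roughly 19 doublings in 35 years works out to one doubling about every 22 months, closer to Moore's later revision of the law than to the original annual pace.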
How much more density and complexity the manufacturing process can stand will be largely determined by profitability. As expenses begin to outstrip revenues, innovation becomes firmly tethered to fiscal reality.
None of this bodes well for the economy, which owes much of its explosive gains in wealth and productivity to the steroidal growth of computing power. If computer technology is the engine pulling the economic train, then the prospect of Moore's Law fizzling before a viable alternative to silicon is found is creating economic anxiety. The uneasy question is, what's next?
Some fascinating answers are provided by MIT in its publication Technology Review. The May/June 2000 issue is devoted to the future of computing beyond silicon and offers an engrossing glimpse into four emerging gee-whiz technologies: molecular computing, DNA computing, biological computing, and quantum computing.
Molecular computing, as the name suggests, addresses the problems of miniaturization at the atomic level. Its aim is to fashion molecules that can duplicate the functions of components found on microcircuits.
Mimicking computing operations requires creating three basic elements: logic, composed of switching devices, typically transistors; some form of storage or memory; and a method of connecting absurdly large numbers of each.
Last year, researchers from Hewlett-Packard and UCLA built the first molecular logic, an electronic switch fashioned from a layer of several million molecules. Significantly, they were also able to link a few switches to produce a crude version of an AND gate. The logic function, however, was limited to a single use before becoming inoperative.
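An AND gate of the kind the Hewlett-Packard/UCLA team produced can be pictured as switches wired in series: current flows only when every switch is closed. A minimal sketch in Python, purely as an illustration of the logic (not a model of the actual molecular device):

```python
def series(*switches: bool) -> bool:
    """Switches wired in series pass current only if every one is
    closed -- which is exactly the truth table of an AND gate."""
    return all(switches)

# Truth table for a two-switch (two-input) AND gate
for a in (False, True):
    for b in (False, True):
        print(f"{a!s:>5} AND {b!s:>5} -> {series(a, b)}")
```

The same picture extends to other gates: switches in parallel behave as OR, which is why linking even a few molecular switches is a meaningful step toward general logic.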
Shortly thereafter, researchers from Rice and Yale, led by Mark Reed, head of Yale's electrical engineering department, improved the basic design by creating a reusable switch comprising only about 1,000 molecules. They also successfully fabricated molecular memory. Using specially synthesized organic matter, they fashioned individual molecules that could change their conductivity by storing electrons on demand, thus creating a binary equivalent.
The prospect of storage densities a million times greater than silicon is among the reasons IBM is conducting research in the field. But equally attractive is the promise of affordable manufacturing. In Reed's words, molecular devices are "astonishingly easy (and potentially cheap) to make," although he warns there are molecular mountains yet to climb. Reed cautions, "Before we can build complete, useful circuits we must find a way to secure many millions, if not billions, of molecular devices of various types against some kind of immobile surface and to link them in any manner into whatever patterns our circuit diagrams dictate." The technology is still too young to say for sure whether this monumental challenge will ever be surmounted.
Even as Reed was expressing his concerns, scientists at Bell Labs were busy constructing the first DNA motors and, in the process, solving the problem of how to assemble molecular components. DNA computing is a melding of biology and computer science and shows potential for massive-scale number-crunching and for building things like self-assembling circuit boards or molecular motors. The term "motors" in this context is somewhat misleading. What researchers have built more accurately resembles a pair of tweezers, assembled by mixing three strands of designer DNA in a test tube. Normally, DNA will bind only with other complementary DNA strands, forming the now-familiar double-stranded helix. The breakthrough permitted researchers to program strands to assemble themselves in a desired configuration. By extension, entire electronic systems could, in theory, be designed to self-assemble in a test tube. Molecular components, each marked with a unique DNA label and mixed in a beaker, would locate complementary labels and assemble into what Oxford physicist Andrew Turberfield calls an "ordered, designed device."
So far, however, such rudimentary ordered devices have produced chaotic results. In computing, there is a classic problem that grinds even large processors to a halt. A traveling salesman must visit seven cities connected by 14 one-way flights. The computers job is to map the most efficient route. Leonard Adleman, a scientist and cryptographer at USC, created strands of DNA to represent each flight, then tossed them
into a test tube. The resulting molecular frenzy produced an astounding number of possible routes but was unable to single out the preferred one. Technology Review reports that although the DNA in one-fiftieth of a teaspoon produced 100 trillion answers in less than one second, most of those answers were repeats, and most of them were incorrect.
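Adleman's experiment amounts to a massively parallel generate-and-filter search. The same strategy can be sketched in ordinary code; the flight map below is invented for illustration (the article gives the counts, seven cities and 14 one-way flights, but not the actual routes):

```python
import math
from itertools import permutations

# Hypothetical instance: 7 cities, 14 one-way flights (illustrative map).
cities = list(range(7))
flights = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6),
           (0, 2), (1, 3), (2, 4), (3, 5), (4, 6),
           (1, 4), (2, 5), (0, 3)}

def valid_route(route):
    """A route survives only if every consecutive hop is an actual
    flight -- the in-silico analogue of strands binding end to end."""
    return all((a, b) in flights for a, b in zip(route, route[1:]))

# Generate every ordering of the cities (the DNA does this in parallel,
# all at once), then filter down to the routes the flight map permits.
routes = [r for r in permutations(cities) if valid_route(r)]
print(f"{len(routes)} of {math.factorial(len(cities))} candidate routes are valid")
```

The generation step is what DNA does spectacularly well; the filtering step, done here in one list comprehension, is precisely the part that required weeks of lab work in the test tube.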
Extracting the right answer requires a lot of additional lab work, making DNA computing highly impractical. But Erik Winfree, at the California Institute of Technology, appears to have mastered plucking the needle from the molecular haystack. Winfree invented what are known as DNA tiles, capable of doing mathematical operations and providing correct answers by fitting together in specific ways. DNA tiles are constructed from three or more strands of DNA, then coded on the edges so that they couple only with complementary tiles in precise and predictable patterns. Both data and numbers can be represented by the couplings, and scientists are hard at work constructing a simple molecular abacus out of DNA tiles. "Simple," when applied to computing at a subcellular level, is a relative term. If perfected, a single tube of DNA tiles could, by some estimates, perform about 10 trillion additions per second, about a million times faster than an electronic computer.
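The key idea behind the tiles, assembly driven entirely by edge codes, can be shown in a toy model. The tile names and edge labels below are invented for illustration; real DNA tiles encode their edges in base sequences:

```python
# Toy model of edge-matched tile assembly: a tile attaches to the
# growing chain only if its left edge code matches the chain's exposed
# right edge. Tiles and edge codes are hypothetical.
tiles = [
    {"name": "T1", "left": "a", "right": "b"},
    {"name": "T2", "left": "b", "right": "c"},
    {"name": "T3", "left": "c", "right": "a"},
]

def assemble(start_edge, pool, length):
    """Grow a chain deterministically by repeated edge matching."""
    chain, edge = [], start_edge
    for _ in range(length):
        tile = next(t for t in pool if t["left"] == edge)
        chain.append(tile["name"])
        edge = tile["right"]  # the newly exposed edge
    return chain

print(assemble("a", tiles, 5))  # ['T1', 'T2', 'T3', 'T1', 'T2']
```

Because each attachment is forced by the exposed edge, the finished pattern is a computation's output rather than random aggregation, which is what lets tiles produce correct answers instead of Adleman-style haystacks.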
What DNA computing attempts to do with the double helix, biological computing seeks to do with cells: turning living organisms into computational systems. Imagine a petri dish of bacteria madly computing away, with a million times more memory than today's largest computers. Don't expect biological computing to replace conventional business systems, but if cells can be programmed successfully, the worlds of IT and biochemistry will intersect powerfully. Every cell, as Technology Review notes, has a miniature chemical factory at its command: "Once the organism [is] programmed, virtually any biological chemical could be synthesized at will." And the cost would be modest. Program a single cell, and you can grow a billion more for the cost of a simple nutrient solution.
Vats of cells could become tomorrow's chemical factories; human cells could be programmed to inject insulin as needed into a diabetic's bloodstream. A Band-Aid could acquire a doctor's skill, capable of analyzing an injury and healing the damage. Toxic-spill-eating bacteria could be designed to lie dormant in the earth until an accidental discharge activated them. Researchers hope to develop a library of genetic applets that, when injected into a human host, could perform gene therapy, curing or preventing a wide range of diseases.
If you think the science of cellular and subcellular computing is a lot to wrap your mind around, follow Alice through the rabbit hole to quantum computing. This is a world in which an electron can be in two places at once, in which an atomic nucleus can be spinning clockwise and counterclockwise at the same time. But only if it is not observed. Quantum computing is based on quantum bits, or qubits, typically encoded in the spin of subatomic particles. Spinning one way, a qubit represents a binary 0. Spinning the opposite way, a 1. But a qubit can also spin in both directions simultaneously, representing both 0 and 1 at once. Thus, three qubits can represent all eight possible three-bit numbers simultaneously. Given enough qubits, the advantage is the ability to process every possible input simultaneously: the most perfect form of parallel processing imaginable.
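The claim that a handful of qubits spans every possible input can be made concrete with a tiny state-vector sketch. A register of n qubits is described by 2^n amplitudes, and an equal superposition weights every n-bit value at once; the fact that a classical simulation must store all 2^n numbers explicitly is exactly why it cannot scale:

```python
import itertools

# A register of n qubits is described by 2**n complex amplitudes.
# An equal superposition gives every n-bit value the same weight.
n = 3
amplitudes = [1 / (2 ** (n / 2))] * (2 ** n)  # |000> through |111>

for bits, amp in zip(itertools.product("01", repeat=n), amplitudes):
    print("".join(bits), f"probability {abs(amp) ** 2:.3f}")

total = sum(abs(a) ** 2 for a in amplitudes)
print("probabilities sum to", round(total, 6))
```

Each of the eight three-bit values carries probability 1/8, and a quantum algorithm manipulates all of them in a single operation, which is the source of the "perfect parallelism" described above.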
IBM recently announced a working 5-qubit quantum system. Researchers at Los Alamos have constructed a 7-qubit computer. Hundreds of qubits will be needed to construct a useful system.
Max Born, who debated quantum physics with Einstein, observed that progress in physics has always moved from the intuitive to the abstract. It is a wonder of science that the abstract is being made real. Perhaps Moore's Law hasn't fizzled after all; it's changing venues.
MC Press Online