Put a cat in a box, he proposed, and rig up a Rube Goldberg contraption involving a hammer, a vial of poison and a quantum triggering device. If an electron is in one position, the hammer will remain safely cocked. But if the electron moves into the opposite location, the hammer will drop, smashing the vial and killing the cat.

The laws of quantum mechanics hold that as long as the electron remains undisturbed, it hangs in limbo, occupying both its possible states. The cat, by extension, is both dead and alive.
Scientists still disagree over just what to make of Schrödinger's thought experiment. But that has not stopped them from exploiting the bizarre rules of quantum physics to make computers whose tiny registers can be both on and off — registering 1 and 0 at the same time. As hard as it is to fathom, theorists have proved that such a machine — a quantum computer — could perform multitudes of calculations simultaneously, leaving the mightiest supercomputers choking in the dust.
In the latest of a steady stream of small developments, researchers in the Netherlands and Japan reported in the journal Science last week that they had caused an electrical current in a superconducting ring to flow simultaneously clockwise (representing 1) and counterclockwise (0). The result was a "qubit," a quantum representation of both the digits of binary arithmetic. In other labs, qubits have been devised from single atoms. Whatever is used as the quantum abacus beads, the result is an exponential explosion in computing power.
Ordinarily a row of 10 bits (think of them as tiny switches turned on or off) can hold any one of a thousand different numbers (1,024 to be exact). But a row of 10 qubits, because of its quantum nature, can hold all the numbers at once. To find the square root of every number from 1 to 1,000, you would load them all onto a row of 10 atoms, perform a single calculation, and — voilà — all the answers would appear.
Every time you add a qubit to the string, the computing power doubles. A row of 11 atoms will carry out 2,048 simultaneous calculations, and a row of 12 will do 4,096. By the time you get to just 14 atoms, a speck still far too tiny to see, you can do more calculations in tandem (16,384) than the fastest supercomputer in the United States — a machine at Los Alamos National Laboratory so voracious that it draws several megawatts of power.
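The doubling described above is nothing more exotic than the arithmetic of powers of two. A minimal sketch (illustrative counting only, not a simulation of quantum hardware; the function name is invented for this example):

```python
def simultaneous_states(n_qubits: int) -> int:
    """Number of values an n-qubit register can hold in superposition.

    Each added qubit doubles the count: n qubits span 2**n states.
    """
    return 2 ** n_qubits

for n in (10, 11, 12, 14):
    print(f"{n} qubits -> {simultaneous_states(n)} simultaneous values")
```

Running it reproduces the figures in the article: 10 qubits give 1,024 values, 11 give 2,048, 12 give 4,096, and 14 give 16,384.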
Mathematicians have proved that a quantum computer with thousands of calculating atoms could rapidly find the factors of numbers hundreds of digits long — a problem that would take the best conventional supercomputers billions of years. Since the codes used to protect corporate and military secrets are based on factoring, this development is of more than academic interest. Program a row of atoms to scan huge databases of information, and the result could be, among other things, the ultimate chess master, a quantum Deep Blue.
Of course, what is true in theory may turn out to be impossible in practice. As one researcher, Lov Grover of Lucent, put it, "We're writing the software for a device that does not yet exist."
Though nothing in the laws of physics rules out quantum computers, qubits are maddeningly delicate. Experiments must be done at temperatures near absolute zero, and the slightest disturbance can cause the teetering quantum states to collapse. Given such obstacles, quantum computing's accomplishments have so far been rather modest: using a short string of atoms to find the factors of the number 15 or to search a "database" of eight items.
What happens from here on out depends on the kind of technological advances that are always hard to predict. About the same time that Schrödinger unleashed his quantum cat, the British mathematician Alan Turing was sketching out the theory of the modern digital computer. A decade later, during World War II, Turing was breaking codes at Bletchley Park, where Colossus — a room-size electronic calculating machine with more than 1,500 vacuum tubes — was built to crack German ciphers. The abstraction had become real.
George Johnson's newest book, A Shortcut Through Time: The Path to the Quantum Computer, was published this month by Alfred A. Knopf.