The Quantum Quest for a Revolutionary Computer

Quantum computing uses strange subatomic behavior to exponentially speed up processing. It could be a revolution, or it could be wishful thinking

Photograph by Gregg Segal for TIME


Those rules turned out to be very odd. They included principles like superposition, according to which a quantum system can be in more than one state, and even more than one place, at the same time. Uncertainty is another: the more precisely we know the position of a particle, the less precisely we know how fast it's traveling; we can't know both at once. Einstein ultimately found quantum mechanics so monstrously counterintuitive that he rejected it as either wrong or profoundly incomplete. As he famously put it, "I cannot believe that God plays dice with the world."

The modern computing era began in the 1930s, with the work of Alan Turing, but it wasn't until the 1980s that the famously eccentric Nobel laureate Richard Feynman began kicking around questions like: What would happen if we built a computer that operated under quantum rules instead of classical ones? Could it be done? And if so, how? More important, would there be any point?

It quickly became apparent that the answer to that last one was yes. Regular computers (or classical computers, as quantum snobs call them) work with information in the form of bits. Each bit can be either a 1 or a 0 at any one time, and the same holds for any arbitrarily large collection of classical bits; this is pretty much the foundation of information theory and digital computing as we know them. Because every bit is always in exactly one definite state, if you ask a classical computer a question, it has to proceed in an orderly, linear fashion to find an answer.

Now imagine a computer that operates under quantum rules. Thanks to the principle of superposition, its bits could be 1, or 0, or 1 and 0 at the same time.

In its superposed state, a quantum bit exists as two equally probable possibilities. According to one theory, at that moment it's operating in two slightly different universes at the same time, one in which it's 1, one in which it's 0; the physicist David Deutsch once described quantum computing as "the first technology that allows useful tasks to be performed in collaboration between parallel universes." Not only is this excitingly weird, it's also incredibly useful. If a single quantum bit (or as they're inevitably called, qubits, pronounced cubits) can be in two states at the same time, it can perform two calculations at the same time. Two quantum bits could perform four simultaneous calculations; three quantum bits could perform eight; and so on. The power grows exponentially.
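The exponential growth described above can be illustrated with a short sketch. This is a plain-Python toy, not how a real quantum machine works: it simply tracks the 2^n amplitudes that describe an n-qubit state, one amplitude per classical bit string.

```python
import math

def uniform_superposition(n):
    """State vector for n qubits, each placed in an equal superposition
    of 0 and 1 (as a Hadamard gate would do): 2**n amplitudes, one for
    every possible classical bit string."""
    amplitude = 1 / math.sqrt(2 ** n)
    return [amplitude] * (2 ** n)

for n in (1, 2, 3):
    state = uniform_superposition(n)
    # 1 qubit -> 2 amplitudes, 2 qubits -> 4, 3 qubits -> 8, and so on
    print(n, len(state))
```

Doubling the amplitude list with every added qubit is exactly why simulating even a few dozen qubits overwhelms a classical computer, and why a machine that manipulates those amplitudes natively is so tantalizing.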

The supercooled niobium chip at the heart of the D-Wave Two has 512 qubits and therefore could in theory perform 2^512 operations simultaneously. That's more calculations than there are atoms in the universe, by many orders of magnitude. "This is not just a quantitative change," says Colin Williams, D-Wave's director of business development and strategic partnerships, who has a Ph.D. in artificial intelligence and once worked as Stephen Hawking's research assistant at Cambridge. "The kind of physical effects that our machine has access to are simply not available to supercomputers, no matter how big you make them. We're tapping into the fabric of reality in a fundamentally new way, to make a kind of computer that the world has never seen."
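The atoms-in-the-universe comparison can be checked with a couple of lines of integer arithmetic, using the common rough estimate of about 10^80 atoms in the observable universe:

```python
states = 2 ** 512    # number of basis states of a 512-qubit system
atoms = 10 ** 80     # rough common estimate, observable universe

print(len(str(states)))            # 2**512 is a 155-digit number, i.e. ~10**154
print(states // atoms > 10 ** 70)  # True: larger by dozens of orders of magnitude
```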
