
Why Google's Quantum Supremacy Milestone Matters


Google AI Quantum's Sycamore processor.

Credit: Erik Lucero/Google

Google officially announced last week in the journal Nature that it achieved the milestone of "quantum supremacy." This phrase, coined by the physicist John Preskill in 2012, refers to the first use of a quantum computer to make a calculation much faster than we know how to do it with even the fastest supercomputers available. The calculation doesn't need to be useful: much like the Wright Flyer in 1903, or Enrico Fermi's nuclear chain reaction in 1942, it only needs to prove a point.

Over the last decade, together with students and colleagues, I helped develop much of the theoretical underpinning for quantum supremacy experiments like Google's. I reviewed Google's paper before it was published. So the least I can do is to try to explain what it means.

Until recently, every computer on the planet — from a 1960s mainframe to your iPhone, and even inventions as superficially exotic as "neuromorphic computers" and DNA computers — has operated on the same rules. These were rules that Charles Babbage understood in the 1830s and that Alan Turing codified in the 1930s. Through the course of the computer revolution, all that has changed at the lowest level are the numbers: speed, amount of RAM and hard disk, number of parallel processors.

But quantum computing is different. It's the first computing paradigm since Turing that's expected to change the fundamental scaling behavior of algorithms, making certain tasks feasible that had previously been exponentially hard. Of these, the most famous examples are simulating quantum physics and chemistry, and breaking much of the encryption that currently secures the internet.


In my view, the Google demonstration was a critical milestone on the way to this vision. At a lab in Santa Barbara, Calif., a Google team led by John Martinis built a microchip called "Sycamore," which uses 53 loops of wire around which current can flow at two different energies, representing a 0 or a 1. The chip is placed into a dilution refrigerator the size of a closet, which cools the wires to a hundredth of a degree above absolute zero, causing them to superconduct. For a moment — a few tens of millionths of a second — this makes the energy levels behave as quantum bits or "qubits," entities that can be in so-called superpositions of the 0 and 1 states.

This is the part that's famously hard to explain. Many writers fall back on boilerplate that makes physicists howl in agony: "imagine a qubit as just a bit that can be both 0 and 1 at the same time, exploring both possibilities simultaneously." If I had room for the honest version, I'd tell you all about amplitudes, the central concept of quantum mechanics since Werner Heisenberg, Erwin Schrödinger and others discovered it in the 1920s.

Here's a short version: In everyday life, the probability of an event can range only from 0 percent to 100 percent (there's a reason you never hear about a negative 30 percent chance of rain). But the building blocks of the world, like electrons and photons, obey different, alien rules of probability, involving numbers — the amplitudes — that can be positive, negative, or even complex (involving the square root of -1). Furthermore, if an event — say, a photon hitting a certain spot on a screen — could happen one way with positive amplitude and another way with negative amplitude, the two possibilities can cancel, so that the total amplitude is zero and the event never happens at all. This is "quantum interference," and is behind everything else you've ever heard about the weirdness of the quantum world.
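
A toy calculation shows the cancellation at work. Suppose a photon can reach a certain spot on the screen by two paths whose amplitudes are equal in size but opposite in sign; the short Python sketch below (with made-up numbers chosen only for illustration, not data from Google's experiment) adds the amplitudes the quantum way and contrasts the result with what ordinary probabilities would give.

    # Toy illustration of quantum interference (illustrative numbers only).
    import math

    # A photon can reach a certain spot on the screen via two paths.
    # Each path has an amplitude; amplitudes may be positive, negative,
    # or even complex (e.g. 0.5 + 0.5j), unlike ordinary probabilities.
    amp_path_1 = +1 / math.sqrt(2)
    amp_path_2 = -1 / math.sqrt(2)

    # Quantum rule: add the amplitudes first, then take the squared magnitude.
    quantum_probability = abs(amp_path_1 + amp_path_2) ** 2   # 0.0 -- the paths cancel

    # If the two paths were ordinary, independent possibilities, their
    # probabilities (each |amplitude|^2 = 0.5) would simply add.
    classical_probability = abs(amp_path_1) ** 2 + abs(amp_path_2) ** 2   # 1.0

    print(quantum_probability, classical_probability)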

 

From The New York Times
View Full Article