In 1981, Nobel laureate Richard Feynman challenged the computing community to build a quantum computer. We have come a long way. In 2015, McKinsey estimated there were 7,000 researchers working on quantum computing, with a combined budget of $1.5 billion.20 In 2018, dozens of universities, approximately 30 major companies, and more than a dozen startups had notable R&D efforts.a Now seems like a good time to review the business.
How do quantum computers work? Quantum computers are built around circuits called quantum bits, or qubits. One qubit can represent not just 0 or 1, as in traditional digital computers, but 0 or 1 or both simultaneously, a phenomenon called "superposition." A pair of qubits can represent four states, three qubits eight states, and so on. N qubits can represent 2^N bits of information, and even 300 qubits can represent information equal to the estimated number of particles in the known universe.21 To perform calculations, qubits exploit superposition and "entanglement." Entanglement occurs when two quantum systems (such as electrons or nuclei), once they interact, become connected and retain a specific correlation in their spin or energy states (which represent combinations of 0 and 1), even when physically separated. Entanglement makes it possible for qubits to work together and represent multiple combinations of values simultaneously, rather than one combination at a time. Once a calculation is finished, you observe the qubits directly as 0 or 1 values to determine the solution, as with a classical computer.
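To make the state-space arithmetic concrete, the following sketch (a minimal Python illustration using the textbook state-vector model, not any vendor's hardware or API; the zero_state helper and the numbers are ours) represents an n-qubit register as 2^N complex amplitudes, builds a simple two-qubit entangled state, and shows how quickly the amplitude count grows.

```python
import numpy as np

# A classical n-bit register holds one of 2**n values; an n-qubit register's
# state is a vector of 2**n complex amplitudes. This toy state-vector view
# is a standard textbook model, not any vendor's API.
def zero_state(n):
    """Return the |00...0> state as a length-2**n amplitude vector."""
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0
    return state

# Two-qubit Bell (entangled) state: (|00> + |11>) / sqrt(2).
bell = np.zeros(4, dtype=complex)
bell[0b00] = 1 / np.sqrt(2)
bell[0b11] = 1 / np.sqrt(2)

# Measuring collapses the register to a single classical outcome, with
# probabilities given by |amplitude|**2: here 50/50 between 00 and 11,
# and never 01 or 10, because the two qubits' values are correlated.
probs = np.abs(bell) ** 2
outcome = np.random.choice(len(bell), p=probs)
print(f"measured bitstring: {int(outcome):02b}")

# The exponential growth that makes classical simulation infeasible:
for n in (10, 50, 300):
    print(f"{n} qubits -> 2**{n} = {2**n:.3e} amplitudes")
```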
What are the technical hurdles? Qubits resemble hardwired logic gates, usually made of atomic particles and superconductor materials chilled to near absolute zero. A one-qubit system is not so difficult to build, but a quantum computer needs multiple qubits to do calculations, and at least 50 qubits to do anything useful.14 We might need 4,000 to 8,000 entangled qubits to surpass current encryption technology based on very large integers.3 Programming the devices also requires specialized hardware-design skills, not conventional software programming skills.3
Entangled qubits are difficult to use and scale because of another phenomenon called "decoherence." The specific correlations between quantum states can dissipate over time, destroying the ability of qubits to explore multiple solutions simultaneously. A useful analogy is to think of qubit outputs as smoke rings blown from a cigar.14 The rings can represent information but disintegrate (lose their "coherence") quickly. Because entangled qubits have a small probability of taking on different values due to external interactions, the computations also require a separate process to detect and correct errors.
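As a loose classical analogy to that error-correction step (real quantum error correction is considerably more involved, since it must avoid directly measuring the fragile quantum state; the error probability here is made up for illustration), consider encoding one bit redundantly and recovering it by majority vote:

```python
import random

# Loose classical analogy to the error-correction idea: encode one logical
# bit redundantly, let noise flip each copy with small probability p,
# then recover the value by majority vote. Real quantum error correction
# is far more involved, but the goal is the same: detect and undo errors
# before they accumulate.
def noisy_copy(bit, p):
    return bit ^ (random.random() < p)

def encode_decode(bit, p, copies=3):
    received = [noisy_copy(bit, p) for _ in range(copies)]
    return int(sum(received) > copies // 2)  # majority vote

p = 0.05          # per-copy error probability (illustrative)
trials = 100_000
uncorrected = sum(noisy_copy(1, p) != 1 for _ in range(trials)) / trials
corrected = sum(encode_decode(1, p) != 1 for _ in range(trials)) / trials
print(f"error rate without coding: {uncorrected:.4f}")
print(f"error rate with 3-copy majority vote: {corrected:.4f}")
```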
Figure. The D-Wave 2000Q chip, designed to run quantum computing problems, doubles the qubit count from 1,000 to 2,000, allowing larger problems to be run; increasing the number of qubits yields an exponential increase in the size of the feasible search space.
How many different ways are there to build quantum computers? There are several competing technologies. D-Wave was founded in 1999 to accumulate patent rights in exchange for research grants.17 It has been funded mainly by venture capital, corporate investors such as Goldman Sachs, and, more recently, Jeff Bezos and the CIA.13 The company has focused on "adiabatic quantum computing," also known as "quantum annealing." D-Wave used this approach to build a 28-qubit device in 2007 and has been marketing a 2,000-qubit device since 2017. Each D-Wave qubit is a separate lattice contained within a magnetic field of Josephson junctions (logic circuits made of superconductor materials that exploit quantum tunneling effects) and couplers (which link the circuits and pass information). You program the device by loading mathematical equations into the lattices. The processor then explores all possible solutions simultaneously, rather than one at a time; the answer that requires the lowest energy represents the optimal solution.10 However, some critics note that D-Wave qubits do not all seem to work together or exhibit quantum entanglement, and may not operate faster than conventional computers.4
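The energy-minimization framing can be illustrated with a toy example. The sketch below (plain Python with a made-up three-variable problem matrix Q; this is not D-Wave's programming interface) brute-forces a tiny quadratic binary optimization problem, the kind of formulation an annealer relaxes toward its lowest-energy configuration for problems far too large to enumerate.

```python
import itertools
import numpy as np

# The annealing approach frames a problem as an energy function over binary
# variables; the hardware relaxes toward a low-energy configuration, which
# encodes the answer. This sketch simply brute-forces a tiny instance
# (the problem matrix Q is made up for illustration) to show the framing.
Q = np.array([
    [-1.0,  2.0,  0.0],
    [ 0.0, -1.0,  2.0],
    [ 0.0,  0.0, -1.0],
])  # upper-triangular coefficients for 3 binary variables

def energy(x):
    x = np.array(x, dtype=float)
    return float(x @ Q @ x)

# Enumerate all 2**3 assignments and keep the lowest-energy one.
best = min(itertools.product([0, 1], repeat=3), key=energy)
print("lowest-energy assignment:", best, "energy:", energy(best))
```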
Google and IBM, as well as startups such as Quantum Circuits and Rigetti Computing, deploy a different logic-gate approach, using entangled electrons or nuclei.19 Xanadu, a Toronto startup, uses photons.b Microsoft's design relies on quasi-particles called anyons. Arranged into "topological qubits," these resemble braided knots on a string, with (theoretically) high levels of stability and coherence. Microsoft plans to build a device within five years and make it commercially available via the cloud.1,16
Who leads in the patent race? Patent-related publications have increased from a handful in the 1990s to more than 400 per year in 2016–2017. The U.S. leads with approximately 800 total patents, three to four times the numbers from Japan and China. The company with the largest portfolio is D-Wave, followed by IBM (which started research in 1990) and then Microsoft. IBM leads in annual patent filings. At universities, the leaders in patent applications are MIT, Harvard, Zhejiang (China), Yale, and Tsinghua (China).2
What are some applications where quantum computers should excel? Experts list mathematical problems that require massive parallel computations, such as optimization and simulation, cryptography and secure communications, pattern matching and big-data analysis, and artificial intelligence and machine learning.
D-Wave computers seem to generate "good enough" solutions to complex combinatorial optimization problems with many potential solutions. For example, in 2012, Harvard researchers used a D-Wave computer to do complex simulations of protein folding (useful in drug discovery).22 Since 2013, NASA and Google, along with several universities, have been using D-Wave computers in their joint Quantum AI Lab.7 The lab has explored Web search, speech recognition, planning and scheduling, and operations management.9 Since 2014, Northrop Grumman has been using D-Wave to simulate the behavior of large-scale software systems (useful for error detection).4 Volkswagen, BMW, and Google are relying on D-Wave to analyze the huge amounts of data needed for self-driving cars. In 2017, Volkswagen used a $15-million D-Wave computer accessed via the cloud to optimize the airport routes of 10,000 taxis in Beijing. The machine processed GPS data in seconds that would normally take a conventional computer 45 minutes. The programming took six months, however, and some experts doubt the results, which have not been published in a scientific journal.6,11
Perhaps the "killer app" will be quantum encryption and secure communications. These applications utilize an algorithm discovered in 1994 by Peter Shor, formerly of Bell Labs and now at MIT. Shor demonstrated how to use a quantum computer to factor very large numbers. Entanglement also makes it possible to have unbreakable cryptographic keys across different locations. Governments (the U.S. and China in particular) as well as companies (AT&T, Alibaba, BT, Fujitsu, HP, Huawei, Mitsubishi, NEC, Raytheon, and Toshiba, among others) have been pursuing these applications.c China seems especially advanced.18
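Shor's contribution can be sketched classically. Factoring N reduces to finding the period of a^x mod N for a random base a; the quantum computer's exponential advantage lies entirely in that period-finding step. The toy Python below (our illustration, with the quantum step replaced by brute-force search, so it handles only tiny numbers) shows the reduction:

```python
import math
import random

# Shor's insight: factoring N reduces to finding the period r of
# f(x) = a**x mod N for a random a coprime to N. The quantum computer's
# job is the period-finding step, which it does exponentially faster than
# known classical methods. Here that step is a brute-force search, so the
# sketch only works for toy-sized N.
def find_period(a, n):
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_sketch(n):
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                      # lucky guess already shares a factor
        r = find_period(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            f = math.gcd(pow(a, r // 2) - 1, n)
            if 1 < f < n:
                return f

n = 15
f = shor_classical_sketch(n)
print(f"{n} = {f} x {n // f}")
```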
Do quantum computers represent a new general-purpose computing "platform"? No. Quantum computers are special-purpose devices that exploit quantum phenomena for massively parallel computations. They are not suited to everyday computing tasks that require speed, precision, and ease of use at low cost. The competing technologies also seem useful for different applications, and so multiple types of quantum computers may persist, splitting potential application ecosystems. D-Wave computers tackle optimization and simulation problems; they cannot run Shor's algorithm, and so may not be useful for cryptography or quantum communications. IBM, Google, and Microsoft, as well as several startups, are designing more general-purpose devices, but these are still theoretical, experimental, or small scale.
For the business to progress faster, more people need access to bigger quantum computers so they can build better programming tools and test real-world applications. Toward this end, IBM has made small quantum computers available via the cloud and is heading toward bigger devices; users have already run approximately 300,000 experiments.12,15 Google has made its D-Wave computer available to researchers as a cloud service.8 Google is also designing bigger machines with a different technology. Microsoft announced in 2017 that it would offer up to 40 qubits via a simulator on the Azure cloud. Microsoft has also created a quantum programming language called Q# and integrated this with Visual Studio.3,d However, Microsoft has not yet built physical devices and the programming language may be completely specific to its architecture.5
In short, quantum computing still resembles conventional computing circa the late 1940s and early 1950s. We have laboratory devices and some commercial products and services, but mostly from one company. We have incompatible architectures still in the research stage, with different strengths and weaknesses. All the machines require specialized skills to build and program. Companies still work closely with universities and national laboratories. There is no consensus as to what is the best technology or design. D-Wave led the first generation but its computers are technically limited and scientifically controversial. Although D-Wave should survive as a niche player, IBM and Google seem more likely to dominate the next generation, with Microsoft and maybe a startup or two close on their heels.12
1. Bisson, S. Inside Microsoft's quantum computing world. InfoWorld (Oct. 17, 2017).
2. Brachman, S. U.S. leads world in quantum computing patent filings with IBM leading the charge. IP Watchdog (Dec. 4, 2017).
3. Bright, P. Microsoft makes play for next wave of computing with quantum computing toolkit. Ars Technica (Sept. 25, 2017).
4. Brooks, M. Quantum computers buyers' guide: Buy one today. New Scientist (Oct. 15, 2014).
5. Campbell, F. Microsoft's quantum computing vaporware. Forbes.com (Dec. 18, 2017).
6. Castellanos, S. Companies look to make quantum leap with new technology. The Wall Street Journal (May 6, 2017).
7. Choi, C. Google and NASA launch quantum computing AI lab. MIT Technology Review (May 16, 2013).
8. Condon, S. Google takes steps to commercialize quantum computing. ZDNet (July 17, 2017).
9. D-Wave Systems, Inc. D-Wave 2000Q system to be installed at quantum artificial intelligence lab run by Google, NASA, and Universities Space Research Association. Press Release (Mar. 13, 2017).
10. D-Wave Systems, Inc. Introduction to the D-Wave quantum hardware; https://bit.ly/2FzstKS
11. Ewing, J. BMW and Volkswagen try to beat Google and Apple at their own game. The New York Times (June 22, 2017).
12. Grossman, L. Quantum leap. Time (Feb. 17, 2014).
13. Guedim, Z. 11 Companies set for quantum leap in computing. EdgyLabs (Oct. 12, 2017).
14. Hardy, Q. A strange computer promises great speed. The New York Times (Mar. 21, 2013).
15. Knight, W. Serious quantum computers are finally here. What are we going to do with them? MIT Technology Review (Feb. 21, 2018).
16. Lee, C. How IBM's new five qubit universal quantum computer works. Ars Technica (May 4, 2016).
17. Linn, A. The future is quantum: Microsoft releases free preview of quantum development kit. (Dec. 11, 2017); https://bit.ly/2C3fxv3
18. MacCormack, A., Agrawal, A., and Henderson, R. D-Wave systems: Building a quantum computer. Harvard Business School Case #9-604-073 (Apr. 2004), Boston, MA.
19. Matthews, O. How China is using quantum physics to take over the world and stop hackers. Newsweek (Oct. 30, 2017).
20. Metz, C. Yale professors race Google and IBM to the first quantum computer. The New York Times (Nov. 13, 2017).
21. Palmer, J. Here, there, and everywhere: Quantum technology is beginning to come into its own. The Economist (May 20, 2018).
22. Veritasium. How does a quantum computer work? (June 17, 2013); https://bit.ly/1ApDtjk
23. Wang, B. D-Wave adiabatic quantum computer used by Harvard to solve protein folding problems. Next Big Future (Aug. 16, 2012).