

Quantum Computing for Dummies



The new guide explains the basics of quantum computing and quantum programming, including quantum algorithms.

Credit: IBM

Quantum computers may one day rapidly find solutions to problems no regular computer could ever hope to solve, but quantum programmers remain vanishingly few compared with the world's conventional programmers. Now a new beginner's guide aims to walk would-be quantum programmers through implementing quantum algorithms over the cloud on IBM's publicly available quantum computers.
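
To give a concrete sense of what implementing a quantum algorithm over the cloud involves, the sketch below uses IBM's open-source Qiskit SDK together with the qiskit-ibm-runtime package (assumed tooling; the guide's own examples may differ, and the runtime API has shifted between releases). It builds a tiny two-qubit circuit, sends it to the least-busy IBM quantum computer, and reads back the measurement counts.

    # Sketch only: assumes the qiskit and qiskit-ibm-runtime packages and an
    # IBM Quantum account token; exact API details vary between releases.
    from qiskit import QuantumCircuit, transpile
    from qiskit_ibm_runtime import QiskitRuntimeService, SamplerV2 as Sampler

    service = QiskitRuntimeService(channel="ibm_quantum",
                                   token="YOUR_IBM_QUANTUM_TOKEN")  # placeholder
    backend = service.least_busy(operational=True, simulator=False)  # a real device

    qc = QuantumCircuit(2)
    qc.h(0)           # put qubit 0 into superposition
    qc.cx(0, 1)       # entangle qubit 0 with qubit 1
    qc.measure_all()  # read both qubits out

    compiled = transpile(qc, backend=backend)  # rewrite in the device's native gates
    job = Sampler(mode=backend).run([compiled], shots=1024)
    counts = job.result()[0].data.meas.get_counts()
    print(counts)     # ideally about half '00' and half '11', plus hardware noise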

Whereas classical computers switch transistors either on or off to represent data as ones or zeroes, quantum computers use quantum bits, or "qubits," which, because of the peculiar nature of quantum physics, can exist in a state called superposition, in which they are both 1 and 0 at the same time. This essentially lets each qubit perform two calculations at once. The more qubits that are quantum-mechanically linked, or entangled, within a quantum computer, the greater its computational power, which can grow exponentially with each added qubit.
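
Both ideas can be seen directly in code. In the sketch below (assuming only the core Qiskit package), a Hadamard gate puts one qubit into an equal superposition of 0 and 1, and a controlled-NOT then entangles it with a second qubit, producing the Bell state (|00> + |11>)/sqrt(2); an exact statevector shows both outcomes present at once.

    # Sketch assuming the core qiskit package: superposition plus entanglement
    # on two qubits, inspected with an exact statevector (no hardware needed).
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(2)
    qc.h(0)      # Hadamard: qubit 0 becomes (|0> + |1>)/sqrt(2), a superposition
    qc.cx(0, 1)  # CNOT: qubit 1 now mirrors qubit 0, so the pair is entangled

    state = Statevector.from_instruction(qc)
    print(state)  # amplitude ~0.707 on |00> and |11>; zero on |01> and |10>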

Today's quantum computers are noisy intermediate-scale quantum (NISQ) platforms, meaning they hold at most a few hundred qubits, and error-prone ones at that. Still, quantum processors are widely expected to grow in both qubit count and qubit quality, with the aim of achieving a quantum advantage: finding answers to problems no classical computer could ever solve.

From IEEE Spectrum
View Full Article
