Quantum computers that encode data in fuzzy quantum states that can be 0 and 1 at the same time might help advance artificial intelligence (AI) significantly, according to a series of studies by Massachusetts Institute of Technology researcher Seth Lloyd and colleagues.
A quantum version of machine learning developed by Lloyd's team could support an exponential speedup in machine-learning tasks, building on a simple quantum algorithm for solving systems of linear equations.
Quantum computers can compress the data and carry out calculations on select features extracted from the data and plotted onto quantum bits (qubits). Data can be divided into groups or searched for patterns, thus allowing vast volumes of information to be manipulated with a relatively small number of qubits.
"We could map the whole Universe--all of the information that has existed since the Big Bang--onto 300 qubits," Lloyd observes.
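Lloyd's "300 qubits" remark reflects how quantum state spaces grow: n qubits span 2^n basis states, so indexing N data values needs only about log2(N) qubits. The short sketch below is an illustrative back-of-the-envelope check of that scaling (the helper name and the billion-point example are our own, not from the article):

```python
import math

def qubits_needed(n_values: int) -> int:
    """Minimum number of qubits whose 2**q basis states can index n_values items."""
    return math.ceil(math.log2(n_values))

# 300 qubits index 2**300 distinct basis states -- roughly 10**90,
# which is why Lloyd can speak of mapping the universe's information onto them.
magnitude = math.floor(300 * math.log10(2))
print(f"2^300 ~ 10^{magnitude}")  # 2^300 ~ 10^90

# Conversely, even a billion data points need only ~30 qubits to index.
print(qubits_needed(10**9))  # 30
```

The same logarithmic scaling underlies the article's point that vast volumes of information can be manipulated with relatively few qubits.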
Such quantum AI methods could drastically expedite tasks such as image recognition for comparing photos on the Web or for enabling self-driving automobiles. Demonstrating quantum machine learning in practice will be a tougher challenge, though Lloyd estimates that a small-scale demonstration would require just 12 qubits.
From Nature
Abstracts Copyright © 2013 Information Inc., Bethesda, Maryland, USA