Last month, researchers at OpenAI in San Francisco revealed an algorithm capable of learning, through trial and error, how to manipulate the pieces of a Rubik's Cube using a robotic hand. It was a remarkable research feat, but it required more than 1,000 desktop computers plus a dozen machines running specialized graphics chips, crunching intensive calculations for several months.
The effort may have consumed about 2.8 gigawatt-hours of electricity, estimates Evan Sparks, CEO of Determined AI, a startup that provides software to help companies manage AI projects. That's roughly equal to the output of three nuclear power plants for an hour. A spokesperson for OpenAI questioned the calculation, noting that it makes several assumptions. But OpenAI declined to disclose further details of the project or offer an estimate of the electricity it consumed.
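A quick back-of-the-envelope check shows how that comparison works out, assuming a typical large nuclear reactor produces on the order of 1 gigawatt of electrical power (that figure is an assumption for illustration, not a number from OpenAI or Sparks):

```python
# Illustrative sanity check of the "three nuclear power plants for an hour" comparison.
# The ~1 GW reactor output is an assumed typical value, not a figure from the article.
estimated_energy_gwh = 2.8   # Sparks's estimate for the project, in gigawatt-hours
reactor_output_gw = 1.0      # assumed electrical output of one large reactor, in gigawatts
hours = 1.0                  # comparison window used in the article

reactors_needed = estimated_energy_gwh / (reactor_output_gw * hours)
print(f"Equivalent to ~{reactors_needed:.1f} reactors running for {hours:.0f} hour")
# -> Equivalent to ~2.8 reactors running for 1 hour, i.e. roughly three plants
```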
Artificial intelligence routinely produces startling achievements, as computers learn to recognize images, converse, beat humans at sophisticated games, and drive vehicles. But all those advances require staggering amounts of computing power—and electricity—to devise and train algorithms. And as the damage caused by climate change becomes more apparent, AI experts are increasingly troubled by those energy demands.
"The concern is that machine-learning algorithms in general are consuming more and more energy, using more data, training for longer and longer," says Sasha Luccioni, a postdoctoral researcher at Mila, an AI research institute in Canada.
From Wired