The assumption that the cost of training AI systems declines as computing power doubles every two years does not always hold true.
Research firm OpenAI said accelerating demand caused the computing power required to train large models to skyrocket 300,000-fold by 2018, and that requirement now doubles every 3.5 months.
Facebook's Jerome Pesenti said one round of training for the biggest models can cost "millions of dollars" in electrical power.
Increasing demand for computing power has fueled an explosion in processor design and specialized devices that can efficiently perform artificial intelligence (AI) calculations.
With Moore's Law approaching its physical limits, scientists are pursuing alternative approaches for boosting computing power, such as quantum and neuromorphic computing.
AI researchers will have to extract as much performance as possible from existing technologies for the time being, but some expect specialized hardware and modified software to improve processing speeds.
From The Economist
Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA