The artificial-intelligence industry is often compared to the oil industry: once mined and refined, data, like oil, can be a highly lucrative commodity. Now it seems the metaphor may extend even further. Like its fossil-fuel counterpart, the process of deep learning has an outsize environmental impact.
In "Energy and Policy Considerations for Deep Learning in NLP," to be presented at ACL 2019, the 57th Annual Meeting of the Association for Computational Linguistics, researchers at the University of Massachusetts, Amherst, performed a life cycle assessment for training several common large AI models. They found that the process can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself).
"The figures really show the magnitude of the problem," says Carlos Gómez-Rodríguez, a computer scientist at the University of A Coruña in Spain, who was not involved in the research.
The paper specifically examines the model training process for natural-language processing (NLP), the subfield of AI that focuses on teaching machines to handle human language. Recent NLP advances have required training ever larger models on sprawling data sets of sentences scraped from the Internet. The approach is computationally expensive—and highly energy intensive.
From Technology Review