University of Washington (UW) researchers warn that fast-growing computerized natural-language models can worsen environmental and social issues as the amount of training data required increases.
UW's Emily M. Bender and colleagues said the enormous energy consumption needed to power the language models' computation drives environmental degradation, with the costs borne by marginalized peoples.
Further, the massive scale of computing power required can limit access to such models to only the most well-resourced enterprises and research groups.
Critically, such models can perpetuate hegemonic language because they are trained on text scraped from the Web and other sources, and they can fool people into thinking they are having an actual conversation with a human rather than a machine.
Bender said, "It produces this seemingly coherent text, but it has no communicative intent. It has no idea what it's saying. There's no there there."
From UW News
Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA