
Communications of the ACM

ACM News

AI Chatbots Lose Money Every Time You Use Them. That's a Problem


[Image: ChatGPT running on a smartphone. The tech giants staking their future on AI rarely discuss the technology’s cost. Credit: Gabby Jones/Bloomberg News]

The enormous cost of running today's large language models, which underpin tools like ChatGPT and Bard, is limiting their quality and threatening to throttle the global AI boom they've sparked.

Their expense and the limited availability of the computer chips they require are also constraining which companies can afford to run them, and pressuring even the world's richest companies to turn chatbots into moneymakers sooner than they may be ready to.

"The models being deployed right now, as impressive as they seem, are really not the best models available," said Tom Goldstein, a computer science professor at the University of Maryland. "So as a result, the models you see have a lot of weaknesses" that might be avoidable if cost were no object — such as a propensity to spit out biased results or blatant falsehoods.

From The Washington Post
View Full Article