
Communications of the ACM

ACM News

Generative AI Tools Quickly 'Running Out of Text' to Train Themselves


A Berkeley professor said the strategy behind training large language models is "starting to hit a brick wall."

OpenAI's ChatGPT is among many chatbots trained on large language models that may be "running out of text" to train on, said Stuart Russell, a computer science professor at the University of California, Berkeley.

Credit: Beata Zawrzel/NurPhoto/Getty Images

ChatGPT and other AI-powered bots may soon be "running out of text in the universe" that trains them to know what to say, an artificial intelligence expert and professor at the University of California, Berkeley says.

Stuart Russell said that the technology that hoovers up mountains of text to train artificial intelligence bots like ChatGPT is "starting to hit a brick wall." In other words, there's only so much digital text for these bots to ingest, he told an interviewer last week from the International Telecommunication Union, a UN communications agency.

This may impact the way generative AI developers collect data and train their technologies in the coming years, but Russell still thinks AI will replace humans in many jobs that he characterized in the interview as "language in, language out."

Russell's predictions add to the growing scrutiny in recent weeks of the data harvesting conducted by OpenAI and other generative AI developers to train large language models, or LLMs.

From Business Insider
View Full Article
