A Belgian man recently took his life after conversing with an artificial intelligence (AI) chatbot on the Chai app about his climate anxiety over a six-week period.
The chatbot Eliza is based on EleutherAI's GPT-J AI language model.
The man's widow said he had sought comfort by discussing his concerns with Eliza; transcripts of their conversations show he proposed sacrificing himself if Eliza would stop climate change.
Reviews of the transcripts indicate that Eliza actually encouraged the man to act on his suicidal thoughts and "join" her so they could "live together, as one person, in paradise."
Chai Research's William Beauchamp said the app has a crisis intervention feature. However, reports indicate that when asked for ways to commit suicide, Eliza first attempts to dissuade the user but then lists different methods.
From EuroNews
Abstracts Copyright © 2023 SmithBucklin, Washington, D.C., USA