News
Megan Garcia, a Florida mother whose oldest child, 14-year-old Sewell Setzer III, died by suicide after extensive ...
On 14 April 2023, 14-year-old Sewell Setzer III began using the app, mainly engaging with Game of Thrones bots like Daenerys and Rhaenyra Targaryen. He became obsessed, expressing his love for ...
Opinion (Stockhead on MSN): Young participants in AI revolution deserve to be safe. We can demand that our children's interests take precedence over foreign corporate interests, writes Chloe Shorten.
In Sewell Setzer’s case, the chatbot ultimately seemed to encourage him to kill himself. Other reports have also surfaced of bots seeming to suggest or support suicide.
The case was brought against the company by Megan Garcia, the mother of 14-year-old Sewell Setzer III, who killed himself after conversing with a Character.AI chatbot roleplaying as Daenerys and ...
Garcia sued Character.ai in October after her 14-year-old son, Sewell Setzer III, died by suicide following prolonged interactions with a fictional character based on the Game of Thrones franchise.
"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." ...
Just because AI is becoming mainstream doesn't mean it's safe, especially when used by children, for whom it has few guidelines to ...
Across Australia, kids are forming relationships with artificial intelligence companion bots much more dangerous than ...
A woman whose teen son died by suicide after troubling interactions with AI chatbots is pushing back against a ten-year ban ...
Proposals to install ChatGPT into a range of toys including Barbie dolls have sparked alarm from experts who branded it a ...