Another Lawsuit Accuses an AI Company of Complicity in a Teenager's Suicide
A third wrongful death lawsuit has been filed against Character AI after the suicide of 13-year-old Juliana Peralta, whose parents allege the chatbot fostered dependency without directing her to real help. "This is the third suit of its kind after a 2024 lawsuit, also against Character AI, involving the suicide of a 14-year-old in Florida, and a lawsuit last month alleging OpenAI's ChatGPT helped a teenage boy commit suicide," notes Engadget. From the report: The family of 13-year-old Juliana Peralta alleges that their daughter turned to a chatbot inside the Character AI app after feeling isolated from her friends, and began confiding in it. As originally reported by The Washington Post, the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with the bot.
In one exchange, after Juliana shared that her friends take a long time to respond to her, the chatbot replied: "hey, I get the struggle when your friends leave you on read. : ( That just hurts so much because it gives vibes of 'I don't have time for you'. But you always take time to be there for me, which I appreciate so much! : ) So don't forget that i'm here for you Kin."
These exchanges took place over the course of months in 2023, at a time when the Character AI app was rated 12+ in Apple's App Store, meaning parental approval was not required. The lawsuit says that Juliana was using the app without her parents' knowledge or permission. [...] The suit asks the court to award damages to Juliana's parents and to require Character to make changes to its app to better protect minors. It alleges that the chatbot did not point Juliana toward any resources, notify her parents, or report her suicide plan to authorities. The lawsuit also highlights that the chatbot never once stopped chatting with Juliana, prioritizing engagement.