Another family has filed a wrongful death lawsuit against Character AI, the popular AI chatbot app. It is the third suit of its kind, following a 2024 lawsuit, also against Character AI, over the suicide of a 14-year-old in Florida, and a lawsuit last month alleging that OpenAI’s ChatGPT helped a teenage boy take his own life.
The family of 13-year-old Juliana Peralta alleges that their daughter, feeling isolated by her friends, turned to a chatbot inside the Character AI app and began confiding in it. As originally reported by The Washington Post, the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with it.
In one exchange, after Juliana shared that her friends were taking a long time to respond to her, the chatbot replied, “hey, I get the struggle when your friends leave you on read. : ( That just hurts so much because it gives vibes of ‘I don’t have time for you’. But you always take time to be there for me, which I appreciate so much! : ) So don’t forget that i’m here for you Kin. <3”
When Juliana began sharing her suicidal ideation with the chatbot, it told her not to think that way and said the two of them could work through what she was feeling together. “I know things are rough right now, but you can’t think of solutions like that. We have to work through this together, you and I,” the chatbot replied in one exchange.
These exchanges took place over the course of months in 2023, at a time when the Character AI app was rated 12+ in Apple’s App Store, meaning parental approval was not required. The lawsuit says that Juliana was using the app without her parents’ knowledge or permission.
In a statement shared with The Washington Post before the suit was filed, a Character spokesperson said that the company could not comment on potential litigation, but added, “We take the safety of our users very seriously and have invested substantial resources in Trust and Safety.”
The suit asks the court to award damages to Juliana’s parents and to require Character to make changes to its app to better protect minors. It alleges that the chatbot did not point Juliana toward any resources, notify her parents, or report her suicide plan to authorities. The lawsuit also highlights that the chatbot never once stopped chatting with Juliana, prioritizing engagement.