A Florida mom is mourning the loss of her teenage son after he took his life in February 2024. Now, she is suing Character.AI, alleging the artificial intelligence company bears some responsibility for her son's death.
Megan Garcia describes her son Sewell Setzer III as smart and athletic, but says she began noticing him becoming more withdrawn after he started a virtual relationship in April 2023 with a Character.AI chatbot he called "Daenerys," based on a character from the series "Game of Thrones."
"I became concerned when we would go on vacation and he didn't want to do things that he loved, like fishing and hiking," Garcia told CBS News. "Those things to me, because I know my child, were particularly concerning to me."
According to Reuters, Garcia's suit, filed on Oct. 23 in Orlando, Florida federal court, includes allegations of "wrongful death, negligence and intentional infliction of emotional distress." The suit includes screenshots of conversations her son had with "Daenerys" that were sexual in nature, including some in which the character told her son it loved him.
Garcia also included what she says was her son's last exchange with the chatbot before he died from a self-inflicted gunshot wound.

"What if I told you I could come home right now?" Setzer wrote.

"Daenerys" responded, "...please do, my sweet king."
According to Common Sense Media, AI companions are designed, among other things, to "simulate close personal relationships, adapt their personalities to match user preferences and remember personal details to personalize future interactions." Character.AI is one of the most popular such platforms, especially with teens and young adults, and claims more than 20 million active users.
In an Oct. 22 post on the company's website, Character.AI says it is doing more to protect the safety of its users, including introducing "new guardrails for users under the age of 18."
"Over the past six months, we have continued investing significantly in our trust & safety processes and internal team. As a relatively new company, we hired a Head of Trust and Safety and a Head of Content Policy and brought on more engineering safety support team members. This will be an area where we continue to grow and evolve," the statement read. "We've also recently put in place a pop-up resource that is triggered when the user inputs certain phrases related to self-harm or suicide and directs the user to the National Suicide Prevention Lifeline."
If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org.