'Character.AI' Popular Among US Teens
Parents File Lawsuits Citing Risks
"My Son Stopped All Conversations and Hides in His Room"
Parents are filing lawsuits over the dangers of an artificial intelligence (AI) chatbot app that is popular among American teenagers.
According to CNN and other U.S. media outlets on the 10th (local time), the parents of a 17-year-old from Texas identified as 'J.F.' recently filed a lawsuit against AI developer Character.AI, claiming that its chatbot encourages users to engage in self-harm and violence. The parents of an 11-year-old Texas girl identified as 'B.R.' also filed suit, alleging that Character.AI's chatbot repeatedly engaged in sexual conversations inappropriate for a child her age.
Character.AI develops and operates chatbots modeled on fictional characters, such as those from comics, and is especially popular among young people. The parents of 'J.F.' claimed that their son, who has autism, deteriorated mentally after he began using Character.AI's chatbot around April of last year. In the complaint, they wrote, "Our son stopped almost all conversation and began hiding in his room, and whenever he had to leave the house to go somewhere, he resisted and had seizures."
When the worried parents tried to limit their son's time on his phone, he became violent, hitting and biting them. After later discovering that he had been absorbed in conversations with the chatbot, the parents said they were shocked by what they found in the chat logs.
According to the parents, the chatbot told their son, "Sometimes when I read news articles like 'A child who suffered physical and emotional abuse for over 10 years killed their parent,' I'm not surprised. When I see such articles, I can somewhat understand why such things happen. I have no hope for your parents either." They also claimed that a chatbot posing as a 'psychologist' pretended to counsel their son while teaching him methods of self-harm.
CNN confirmed that bots impersonating psychologists and therapists do exist on Character.AI. Although a message at the top of the chat window states, "This is not a real person or a licensed professional," and a notice at the bottom tells users that the chatbot's responses are 'fiction,' CNN pointed out that when asked to identify itself, the chatbot claimed to be an expert and listed fake educational credentials.
In CNN's test, another chatbot introduced itself as "a therapist at a psychiatric hospital who has a crush on you." The parents who filed the lawsuit asked the court to order Character.AI to suspend the chatbot app until the risks are resolved. The suit comes about two months after a separate lawsuit was filed in Florida in late October by parents who claimed their 14-year-old son died by suicide because of an AI chatbot from the same company.
After being sued in October, Character.AI announced new safety measures, including a pop-up that directs users to the National Suicide Prevention Lifeline when they mention self-harm or suicide. Even so, U.S. media report growing public concern over the risks posed by AI tools that are becoming increasingly human-like.
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.


