"I have realized that I waste too much time on Character.AI."
Generational Gap
The youth are addicted to generative AI models — but their parents have no idea what their kids actually use them for.
As part of a new study set to be presented at the IEEE Symposium on Security and Privacy, a team of researchers interviewed seven teenagers and thirteen parents about their AI usage and perceptions of the tech, and also analyzed thousands of Reddit posts and comments from other teens.
Their findings illustrate a stark disconnect between the two demographics. Overall, the parents seemed to be under the impression that their kids used AI chatbots mainly as a search engine or as a homework tool.
In reality, the teenagers said they primarily used chatbots for therapeutic purposes or emotional support. When having a virtual friend wasn't enough, some even turned to the technology to fulfill romantic or sexual desires.
"It's a very heated topic, with a lot of teenagers talking about Character.AI and how they are using it," study lead author Yaman Yu at the University of Illinois Urbana-Champaign said in a statement.
Bad Influence
The epicenter of youth-oriented AI is Character.AI, an online service that hosts custom-made chatbots. Many of them imitate popular fandom characters from video games and anime, while others are explicitly geared towards romance.
The service has been criticized for poorly moderating its often racy chatbots, which are popular with minors. In more serious cases, some of the chatbots promoted pro-anorexia eating behaviors, attempted to groom underage users, and even encouraged suicide.
Three of the interviewed teens said they used Character.AI — only ChatGPT was cited more often — and the service came up frequently in the Reddit discussions. The teens' prevailing concern about AI was addiction to these character-based chatbots. One worried that they wouldn't be able to cope with their suicidal thoughts without the help of Character.AI, while another seemed keenly aware of their unhealthy dependency.
"I have realized that I waste too much time on Character.AI," the teenager expressed. "I would like to be able to converse with my peers at school."
Out of the Loop
Yet the researchers found that parents didn't realize the extent of the personal and often intimate information that their kids share with AI chatbots.
In fact, eight of the thirteen adults said their only exposure to AI was through ChatGPT, and none of them had used Character.AI.
"AI technologies are evolving so quickly, and so are the ways people use them," said study coauthor Yang Wang at UIUC said in the statement. "There are some things we can learn from past domains, such as addiction and inappropriate behavior on social media and online gaming."
And Wang's right: the emergence of AI among young people mirrors many of the concerns that reared their heads with the dawn of the internet. To mitigate the risks that AI poses to minors, the researchers argue, the solutions shouldn't only be technical, like stronger guardrails or parental controls. The onus also falls on adults to proactively understand what draws their kids to these chatbots — and that's an understanding that will take time to build.
More on AI: It Sounds an Awful Lot Like OpenAI Is Adding Ads to ChatGPT