AI Company Bans Chatbots for Users Under 18

The company faces multiple lawsuits from parents who allege that their children died by suicide after using its chatbots.
Photo caption: In this photo illustration, a teenager uses a phone to access apps in New York City on Jan. 31, 2024. Illustration by Spencer Platt/Getty Images
The Character.AI platform, which offers AI companions to users, will block minors from accessing its chatbots “to keep teen users safe” on the platform, the company said in an Oct. 29 statement.

“We will be removing the ability for users under 18 to engage in open-ended chat with AI on our platform. This change will take effect no later than November 25,” the California-based company said. “During this transition period, we also will limit chat time for users under 18. The limit initially will be two hours per day and will ramp down in the coming weeks before November 25.”