Character.AI to End Open-Ended Chats for Teen Users by November 25
The decision comes after growing scrutiny from regulators and safety experts
Character.AI has announced that by November 25, users under 18 will no longer be able to engage in open-ended conversations with AI Characters. In a blog post published earlier today, the company said it wants to build a “safer, creative experience” for younger users.
During the transition, teen users will have a daily chat limit of two hours, which will gradually decrease before the full restriction takes effect. The company plans to replace chat-based interactions with creative tools that let teens make videos, stories, and streams with AI Characters instead.
To ensure the right experience for each age group, Character.AI will also introduce new age assurance tools, combining its in-house verification model with third-party services like Persona. These tools are meant to more reliably identify under-18 users and apply the appropriate safeguards.
Besides these safety measures, the company is also launching the AI Safety Lab, an independent nonprofit focused on developing safety frameworks for AI-driven entertainment. The lab will collaborate with researchers, academics, and policymakers to explore new techniques and share findings across the industry.
Character.AI says the decision comes after growing scrutiny from regulators and safety experts over how teens interact with AI chatbots. While acknowledging that most young users follow content rules, the company says it is “prioritizing teen safety over convenience.”