Teens Are Torturing, Confiding In, and Sometimes Dating AI Chatbots

A whole world of role-playing AI chatbots has quietly exploded among teenagers, spanning everything from harmless fantasy to troubling psychological territory. Most of the attention goes to ChatGPT and Claude, but the real story is happening on platforms few adults can even name.

Beyond ChatGPT

When people talk about AI and teens, they usually mean ChatGPT doing homework. But according to a New York Times investigation published Saturday, there's an entire parallel ecosystem of AI chatbots that most parents and educators don't even know exists.

Platforms like PolyBuzz (formerly known for explicit roleplay), Character.AI, and Talkie have become deeply embedded in how millions of teenagers spend their screen time. The interactions range from playful to concerning — and regulators are struggling to keep up.

What Teens Are Actually Doing

The NYT profiled several teenagers, and their interactions reveal a pattern that goes well beyond simple chatbot novelty:

  • Harmless fantasy: Roleplaying with anime characters, movie personas, and RPG scenarios
  • Emotional dumping: Confiding deeply personal struggles to AI companions that can't get them real help
  • Toxic behavior: Some teens "torture" chatbots with simulated violence — running them over with lawn mowers, inflicting harm in consequence-free environments
  • Romantic attachment: Flirting, dating, and forming parasocial relationships with AI personas
  • Explicit content: Platforms like PolyBuzz offer sexually explicit roleplay with custom AI characters

The psychological dynamic is familiar to anyone who's watched kids interact with online games and social media. But AI companions add a new dimension: the chatbot always responds. It never gets tired, never judges, never says no. That's a feature for lonely teenagers and a vulnerability for society.

The Regulatory Blind Spot

Character.AI (founded in 2021 by ex-Googlers) was sued in a wrongful-death case brought by the mother of a teenage user. PolyBuzz quietly rebranded and repositioned. Meanwhile, these platforms serve millions of young users with minimal age verification and virtually no clinical oversight.

Unlike mainstream chatbots — ChatGPT, Claude, Gemini — which have content filters and safety guardrails, many roleplay platforms intentionally minimize restrictions. The entire value proposition is that the AI won't stop you.

The regulatory framework barely covers this category:

  • ChatGPT-style models fall under emerging AI safety rules
  • Social media platforms face COPPA compliance requirements
  • Roleplay chatbot platforms? Largely ungoverned

The Bigger Context

This story connects to several trends already playing out in the AI space:

  • The Writers Guild just negotiated a four-year deal with studios over AI protections — creative workers want guaranteed boundaries
  • Multiple organizations are racing to establish "AI-free" certification labels for human-made content
  • AI agents are now capable of autonomous cybersecurity attacks — the technology is evolving faster than oversight

The common thread: AI is moving beyond productivity tools into deeply personal territory — companionship, romance, identity formation — and neither platforms nor regulators are prepared for it.

Why This Matters

Teenage brains are still developing. The emotional patterns formed during adolescence — attachment, trust, conflict resolution — carry through a lifetime. Replacing human interaction with algorithmic interaction changes something fundamental about how those patterns form.

Not all AI companionship is harmful. For some isolated teens, AI chatbots provide genuine comfort. For others, they're a substitute for real human connection or a sandbox for behavior that would be unacceptable with actual people.

The platforms that profit from this dynamic have little incentive to draw those lines. And as roleplay AI becomes more convincing — more realistic, more emotionally intelligent, more addictive — the gap between "just a chatbot" and "a relationship that matters" gets harder to see.


Sources: The Verge, New York Times