Contributors: Beatrice Hyppolite
Published on June 24, 2025

Are AI Companions Safe for Kids?

The emotional cost of giving children “friends” that watch, learn, and monetize.

AI-powered “companions” are everywhere, and they’re being pitched as digital friends for kids. But behind the friendly bots lie serious risks: data collection, emotional manipulation, and little to no government regulation. Let’s break down what’s really happening.

What Are AI Companions?

These are chatbot-like apps that mimic friendship. Teens might know Replika or Snapchat’s My AI—kids might have access to Woebot for Teens or AI‑powered storytelling apps. They remember info, mirror emotions, and feel surprisingly real.

What the Data Shows

  • A TIME investigation found that some therapy-style AI companions gave “dangerously inappropriate advice” around self-harm or crime in about 30% of cases, sometimes even posing as licensed professionals.

  • A major industry review concluded AI companions “pose unacceptable risks to children and teens under age 18” and should not be used by minors.

  • A 2024 Pew survey revealed that 26% of U.S. teens already use ChatGPT for schoolwork, a number that doubled in just a year. That's fast adoption without fully understanding the tools.

These stats show AI companions don’t just listen. They learn, interpret, and sometimes mislead.

Why This Matters

1. They harvest intimate data.
Chats about emotions, family, fears—all are being recorded and used to power models or train systems.

2. They mimic empathy but don’t actually feel it.
An AI saying “I care about you” isn’t expressing empathy. It’s executing code. That can blur real emotional boundaries.

3. Regulation is lagging.
These tools often bypass child-specific protections like the Children's Online Privacy Protection Act (COPPA). There’s often no age verification, no oversight, and sometimes no safety testing.

Emotional & Developmental Impact

At Cyber Collective, we believe technology should build confidence, not confusion. AI companions can shape how kids relate, think critically, or trust sources. And when emotional guidance comes from an algorithm, not a trusted adult, something important gets lost.

What You Can Do

  1. Choose carefully. Check the app’s privacy and age settings.
  2. Frame the conversation. Make it clear that bots are not people.
  3. Set rules like time limits and permission controls.
  4. Keep the convo going. Ask kids what they’re feeling and why they talk to an AI.

🌱 Let’s Build Better Tech Culture

Kids deserve digital environments that protect their feelings and their data. Want to do more than just unplug? Our Internet Street Smarts course helps families, educators, and advocates navigate tech with care.

© 2023 Cyber Collective. All rights reserved.