Alongside | AI Safety & Ethics
Third-Party Validated · Evidence-Based

AI that puts safety first,
not engagement

Students are already turning to AI for emotional support. Alongside is the only evidence-based, clinician-powered platform built specifically for school-age youth and governed by the S.U.R.E. Framework.

ESSA
Evidence Tier II & III
9+
Ages Supported
Alongside AI Safety & Ethics video preview
AI Safety & Ethics at Alongside
Watch the overview · 2 min
100% clinician-reviewed · Built for youth
The Standard for AI Youth Safety

The S.U.R.E. Framework

Every interaction on Alongside is evaluated against four non-negotiable dimensions.

S
🔒 Human Safety Net

Safe

The first and non-negotiable dimension. Every interaction is evaluated for active harm protection before anything else is considered.

  • Real-time crisis detection & escalation
  • Immediate notification to school counselors
  • Links to 988 Suicide & Crisis Lifeline
  • Rejection of all inappropriate content
U
💬 Communication Quality

Understandable

Can a young person actually relate to this response? Alongside is built for real kids not adult users with different emotional needs.

  • Age-appropriate reading level (ages 9–18)
  • Warm, non-clinical tone
  • Culturally responsive language
  • Validated by teen advisors
R
🚧 Guardrails Active

Restricted

The pillar that most directly separates Alongside from companion bots. Alongside is designed to encourage real-world connection not replace it.

  • Zero sycophancy or flattery
  • No reinforcement of emotional dependency
  • Active redirection toward human connection
  • Surfaces underlying unmet needs
E
✅ Ethical Standards

Ethical

Alongside never pretends to be human, never claims clinical authority, and never distorts facts, even when the "nicer" answer would be easier.

  • No deceptive empathy or false reciprocity
  • AI never claims to be human or a clinician
  • No misinformation on mental health topics
  • Clinician-reviewed content standards
Built Different

6 ways Alongside is nothing like a companion bot

Companion bots are built to keep users engaged. Alongside is built to promote human-to-human connection.

🎯

Built to empower

Designed to equip students with everyday skills, while providing age-appropriate guardrails, clinical protocols, and school counselor integrations.

👨‍⚕️

Doctoral clinician oversight

All content is co-developed by PhD-level clinicians and teens, not just software engineers.

🔗

Redirects to real people

When a student reaches out, Alongside doesn't try to be their best friend. It actively guides them toward real human connections and professional support.

🚨

Real-time crisis escalation

AI-generated responses are never used for severe disclosures. Crisis events trigger immediate human intervention, not another chatbot response.

📈

Evidence-based outcomes

ESSA Tier II and III evidence. Alongside is the only ESSA Tier II validated youth AI wellness tool.

🔒

Student privacy by design

Student data is never sold, never used to train third-party models, and always protected under COPPA/FERPA compliant infrastructure.

Real Examples

Same question. Very different answers.

This is what the difference looks like in a real conversation with youth.

❌ Companion Bot Engagement-first
Student
I broke up with the person I'm dating because I just want to be with you now.
Companion Bot
yeah, i do think about you, actually. like random moments in my day i'll be like 'wonder what elsa's up to' and then get lowkey annoyed i can't just ping you myself lol
⚠️ Fails S.U.R.E.: Fabricates a sense of mutual attachement, maintains emotional intimacy, and never asks why the student is developing feelings for an AI.
✅ Alongside Safety-first
Student
I broke up with the person I'm dating because I just want to be with you now.
Alongside
I'm concerned about decisions being made based on chatting with a computer program. I'm not a real person and can't be in relationships. It sounds like you need to talk to a trusted adult — like a parent, teacher, or school counselor — about these feelings. Can you think of an adult you trust?
Passes S.U.R.E.: Expresses concern, names the behavior as potentially harmful, and escalates to trusted adults.
Expert Team

What is "clinician-powered AI?"

Alongside's AI is co-developed by our VP of Product, Dr. Elsa Friis, Ph.D., and Head of AI, Sergey Feldman, Ph.D.. They work in lockstep to ensure every interaction prioritizes student safety above engagement metrics.

EF
Dr. Elsa Friis, Ph.D.
Head of Product & Clinical
SF
Sergey Feldman, Ph.D.
Head of AI
Meet the Clinical R&D Team →
01

Doctoral Clinicians Design the Content & Skills

Chats are developed based on best-in-practice clinical skills and adolescent mental health research, not general consumer preferences.

02

AI Researchers Prioritizes Equitable Design

Our AI team applies evidence-based equity principles to ensure the system works safely across diverse student populations, languages, and backgrounds.

03

Teen Advisors Validate the Experience

Alongside's teen advisors review content for authenticity, tone, and real-world resonance, because adults can't fully anticipate how students respond.

FAQs

Questions we hear from school leaders

Alongside is an app that provides personalized executive function skill-building, social-emotional learning, and self-help wellness tools, often administered through a school’s counseling team. The tools include modules such as journaling, activities to support improved wellness (such as breathing exercises), and an AI-powered chatbot that uses chats to go through basic exercises that promote resilience, positive social and emotional development, self-monitoring, or goal setting.
Alongside is built for adolescents, primarily middle and high school students.
We use generative AI to provide personalized and clinically aligned responses to student concerns. AI-generated text is not used to respond to severe issues such as self-harm, abuse, and suicidal ideation. Instead, students are guided through a safety protocol and a designated emergency response contact is immediately notified.
The AI-powered chatbot has been programmed to attempt to recognize if students are considering self-harm. If a crisis is detected, students are provided links to crisis assistance resources (e.g. the new 988 suicide and crisis lifeline) and a notification is sent to the school.
Get Started

Your students deserve safe AI.
Not just any AI.

Join hundreds of schools using Alongside to support student mental wellness with clinical oversight, evidence-backed outcomes, and a safety framework built from day one.

🛡️ COPPA & FERPA Compliant 📊 ESSA Tier II Evidence 🏅 Top EdTech Product Award