AI Safety & Ethics

"She made a suicidal outcry..."

Discover how providing students with 24/7 support in their native language is a game-changer for crisis prevention and early intervention.  

Expert Team

What is "clinician-powered AI"?

Alongside’s Head of Mental Health, Elsa Friis, Ph.D., and Head of AI, Sergey Feldman, Ph.D., work together to prioritize student safety above all else.

Doctoral Clinicians

Develop chat models grounded in decades of combined clinical practice and research.

AI Researchers

Develop and implement equitable AI design practices to keep students safe.


Keeping Students Safe in the Age of AI

Given that teens are actively seeking confidential support through AI, it is critical to provide a tool designed with their safety and growth in mind.

Read more on how to evaluate safety >

ELIZA to Alongside: A History of Mental Health Chatbots

AI chatbots come from a long history of clinical research. Explore key moments in a timeline spanning from 1966 to today.

Read the full timeline

Frequently asked questions

What is Alongside?

Alongside is an app that provides social-emotional learning and self-help wellness tools, often administered through a school’s counseling team. The tools include journaling, activities that support improved wellness (such as breathing exercises), and an AI-powered chatbot that guides students through basic exercises promoting resilience, positive social and emotional development, self-monitoring, and goal setting.

Who is Alongside for?

Alongside is built for adolescents, primarily middle and high school students.

How is AI (artificial intelligence) used in Alongside?

We use generative AI to provide personalized and clinically aligned responses to student concerns. AI-generated text is never used to respond to severe issues such as self-harm, abuse, or suicidal ideation. Instead, students are guided through a safety protocol, and a designated emergency response contact is immediately notified.
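
To make that routing concrete, here is a minimal Python sketch of the behavior described above. Every name in it is hypothetical (none come from Alongside's actual system): messages flagged as severe bypass the generative model entirely and instead trigger a scripted safety protocol plus an emergency-contact notification.

```python
# Hypothetical sketch of severity-based routing; all names are illustrative,
# not from Alongside's codebase.

SEVERE_TOPICS = {"self_harm", "abuse", "suicidal_ideation"}

def classify_topic(message: str) -> str:
    """Stand-in for a clinically informed classifier."""
    lowered = message.lower()
    if "hurt myself" in lowered or "suicide" in lowered:
        return "suicidal_ideation"
    return "general"

def notify_emergency_contact(message: str) -> None:
    """Stand-in for alerting the school's designated contact."""
    print("[ALERT] Designated emergency response contact notified.")

def run_safety_protocol(topic: str) -> str:
    """Scripted, non-generative response for severe disclosures."""
    return "I'm really glad you told me. Let's walk through some steps together."

def generate_reply(message: str) -> str:
    """Stand-in for the clinically aligned generative model."""
    return "Thanks for sharing. Tell me more about what's going on."

def respond(message: str) -> str:
    topic = classify_topic(message)
    if topic in SEVERE_TOPICS:
        notify_emergency_contact(message)   # immediate notification
        return run_safety_protocol(topic)   # no AI-generated text here
    return generate_reply(message)
```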

Can Alongside detect if a student is in crisis?

The AI-powered chatbot is designed to recognize when a student may be considering self-harm. If a crisis is detected, the student is given links to crisis assistance resources (e.g., the 988 Suicide & Crisis Lifeline) and a notification is sent to the school.
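
As a rough sketch of that two-part flow (again with purely hypothetical Python names), a detected crisis produces two effects: crisis resources are surfaced to the student, and the school is notified.

```python
# Hypothetical sketch of the crisis-detection follow-up described above;
# function names and the resource list are illustrative only.

CRISIS_RESOURCES = [
    "988 Suicide & Crisis Lifeline (call or text 988)",
]

def send_school_notification(student_id: str) -> None:
    """Stand-in for the school-facing alert channel."""
    print(f"[ALERT] School notified regarding student {student_id}.")

def handle_detected_crisis(student_id: str) -> list[str]:
    """Notify the school, then return resource links to show the student."""
    send_school_notification(student_id)
    return CRISIS_RESOURCES
```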

Ready to get started? Try Alongside today!