Discover how providing students with 24/7 support in their native language is a game-changer for crisis prevention and early intervention.
Alongside’s Head of Mental Health, Elsa Friis, Ph.D., and Head of AI, Sergey Feldman, Ph.D., work together to prioritize student safety above all else.
Develop chat models based on 40 years of combined clinical practice and research.
Develop and implement equitable AI design practices to keep students safe.
AI chatbots build on a long history of clinical research. Read more to stay informed about today's rapid development of AI tools for mental health.
Alongside is an app that provides social-emotional learning and self-help wellness tools, often administered through a school’s counseling team. The tools include journaling, wellness activities (such as breathing exercises), and an AI-powered chatbot that guides students through basic exercises promoting resilience, positive social and emotional development, self-monitoring, and goal setting.
We use generative AI to provide personalized and clinically aligned responses to student concerns. AI-generated text is not used to respond to severe issues such as self-harm, abuse, and suicidal ideation. Instead, students are guided through a safety protocol and a designated emergency response contact is immediately notified.
The AI-powered chatbot is designed to recognize when students may be considering self-harm. If a crisis is detected, students are provided links to crisis assistance resources (e.g. the new 988 Suicide & Crisis Lifeline) and a notification is sent to the school.
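The routing described above can be sketched in a few lines: messages flagged as severe bypass the generative model entirely and instead trigger a fixed safety protocol plus a notification. This is a minimal illustrative sketch, not Alongside's actual implementation; the labels, function names, and reply text are all assumptions.

```python
# Hypothetical sketch of crisis routing: severe messages never receive
# AI-generated text. Labels and functions are illustrative only.

SEVERE_TOPICS = {"self_harm", "abuse", "suicidal_ideation"}  # assumed labels

def route_message(message: str, classify, generate, notify_contact) -> str:
    """Return the chatbot's reply, never generating AI text for severe issues."""
    label = classify(message)  # e.g. a supervised risk classifier
    if label in SEVERE_TOPICS:
        notify_contact(message, label)  # alert the designated emergency contact
        # Fixed, pre-written safety-protocol text -- not AI-generated.
        return ("It sounds like you may be going through something serious. "
                "You can reach the 988 Suicide & Crisis Lifeline by calling "
                "or texting 988. Your school's support team has been notified.")
    return generate(message)  # clinically aligned generative response
```

The key design choice this illustrates is that the generative model is only one branch of the system: the safety check runs first, and the escalation path is deterministic.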