// The Problem
The teachers, counselors, and administrators in a school system are on the frontlines of supporting the mental health of young people. The problem is that they’re often underequipped and overwhelmed. Gao said the average counselor-to-student ratio in the United States is 1:385, meaning one counselor serves an average of 385 students. Meanwhile, nine in 10 teachers don’t feel equipped to address mental health issues effectively. These adults interact with students every day, but they often lack the right resources at the right time.
// The Solution
Lenny is an AI-enabled behavioral health platform. It provides a toolkit of lessons, interventions, family engagement, and analytics that are aligned with proven research and best practices.
Teachers and counselors log in to the platform and describe the issue or challenge they’re facing. Trained on a library of evidence-based methodologies, Lenny generates a personalized set of resources and tools for the educator: it delivers evidence-based lesson plans, creates targeted interventions, and assesses student needs.
// The Model & Traction
Lenny is a nonprofit, but it sells access to its platform to school districts nationwide. It raises philanthropic funding to provide the platform to under-resourced districts at no cost, and it offers individual educators free access, so anyone can sign in and try it.
Lenny grew 10x in 2024 alone. It is now in 700 schools across every state, reaching more than 300,000 students. Gao and Bjork's vision is to be in every school in the country by 2030.
Koko
// Personal Story
Dr. Rob Morris, the founder of Koko, has always been fascinated by psychology and the study of the mind. While pursuing his PhD at MIT with a focus on digital mental health, he found himself struggling with his own mental health, so he started to build a tool just for himself. “It was bespoke to me. If I could help myself, maybe I could build something that would help others, too,” he said. That was the genesis of Koko. Today, Koko has reached over 4 million people by working with digital platforms to provide free, evidence-based mental health interventions.
// The Problem
The results of a 2023 CDC survey highlight how we're in the midst of a youth mental health crisis in the United States:
- One in three students experiences poor mental health most of the time.
- One in three students has felt persistent sadness or hopelessness for two weeks or more during a 12-month period.
- One in five students has seriously considered attempting suicide during a 12-month period.
- One in 10 students has attempted suicide during a 12-month period.
Meanwhile, American teens ages 13-18 spend an average of 5.6 hours per day on their phones.
// The Solution
The philosophy behind Koko is to be pragmatic. “I don't think it's realistic that we're going to shut these platforms down entirely,” Dr. Morris said. “You have to go where the millions of eyeballs already are.”
This means that Koko’s tech adapts to the specific architecture of each social platform. If a user’s search on a social media platform suggests they might need help (searching for pro-anorexia content, for example), Koko uses AI, in partnership with the platform, to suppress that content and redirect the user to evidence-based resources. A simplified sketch of this search-and-redirect pattern follows the examples below.
- On Discord, the model is different. Over 24,000 Discord servers have installed Koko’s AI bot, allowing users on those servers to chat with it and get support.
- On Instagram, users might come across a Koko video in their feed. Clicking the video opens a WhatsApp chat where they can interact with Koko’s resources.
- On TikTok, Koko uses hundreds of thousands of dollars of ad credits provided by the platform to direct young people to mental health services.
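The search-and-redirect flow described above can be summed up in a few lines of code. The sketch below is purely illustrative: the function names, the flagged-term list, and the resource URL are assumptions made for this example, not Koko’s actual implementation or any platform’s real API.

```python
# Illustrative sketch of a "suppress and redirect" search flow.
# All names, terms, and URLs here are hypothetical placeholders,
# not Koko's real implementation or any platform's API.

RISK_TERMS = {"pro-anorexia", "proana", "thinspo"}   # hypothetical flagged terms
RESOURCE_URL = "https://example.org/get-support"     # placeholder resource page


def is_at_risk(query: str) -> bool:
    """Return True if the search query contains any flagged term."""
    normalized = query.lower()
    return any(term in normalized for term in RISK_TERMS)


def handle_search(query: str) -> dict:
    """Suppress results for flagged queries and redirect the user to resources."""
    if is_at_risk(query):
        return {"results": [], "redirect": RESOURCE_URL}
    return {"results": ["...normal platform results..."], "redirect": None}


if __name__ == "__main__":
    print(handle_search("thinspo tips"))   # flagged: empty results plus redirect
    print(handle_search("soccer drills"))  # not flagged: normal results
```

The key design choice mirrors what Koko describes: a flagged query returns no content at all, only a pointer to evidence-based support.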
// The Future
Today, a young person completes an intervention via Koko every 90 seconds (an intervention is an instance in which a user completes a course, receives peer support, or is connected to a crisis line). Dr. Morris’s goal is to scale to the point where someone completes an intervention every second of every day. So far, users have completed 738,000 interventions. To scale, Koko is adding new platforms and expanding its work with university partners (it has completed multiple randomized controlled trials at MIT).
// Building a Responsible Tech Ecosystem
One hallmark of a responsible AI company (nonprofit or otherwise) is its relationship to data: both what it collects and what it doesn’t. In our conversations with Lenny Learning and Koko, the data each organization chose not to collect said a lot about its values around data privacy.
- Lenny intentionally doesn’t collect any student data on its platform; it interacts only with adult educators.
- Koko’s data collection is minimal: the only data collected automatically is the social platform the user came from. Koko doesn’t know a user’s IP address, username, or search history. Data privacy is just one of its ethical commitments.
Lenny and Koko are two of dozens of AI nonprofits in Fast Forward’s portfolio and in the Project Liberty Alliance that are building a safer, better internet.
We’d love to hear from you. What organizations are at the forefront of responsible AI development? Who is responsibly using AI to build The People’s Internet?