“I think it’s important to note that Woebot is not intended for people with OCD, and most importantly, is not appropriately used as a crisis service,” wrote Alison Darcy, Ph.D., founder and CEO of Woebot Health, in an email.
The demand for mental health services has surged in recent years, yet access remains a significant barrier. Nearly one in five U.S. adults struggles with a mental health condition, but only 43% of that group receives treatment. About 55% of U.S. counties, many in rural areas, have no practicing psychiatrist, psychologist or social worker. With long waitlists and high costs limiting traditional care, AI-driven therapy has emerged as a round-the-clock, free or low-cost alternative without the need for appointments or insurance.
“There were so many moments when I was spiraling at 10 p.m. on a Tuesday, and my therapy appointment wasn’t until Thursday,” said Sean Dadashi, co-founder of Rosebud. “I wished I had something that could help me recognize when I was going down an unhelpful path in real time.”
Rosebud does not claim to be a substitute for a therapist. Instead, it acts as a digital journal, using AI to analyze entries, provide reflection and offer prompts.
“It’s a companion to therapy,” Dadashi said. “It helps people process their thoughts, but it also knows when to suggest reaching out to a real therapist.”
Lilly Payne poses for a portrait at the Albany Bulb in Albany on March 20, 2025. (David M. Barreda/KQED)
If a user expresses a desire to hurt themselves or writes about domestic abuse, for example, Rosebud will validate the experience and then suggest professional mental health resources. But not all platforms are designed with the same crisis tools — or any at all.
In October 2024, users of Character.ai filed a lawsuit against the platform, which enables people to have open-ended conversations with AI-generated personalities ranging from fictional characters to historical figures. Many people use it for entertainment or casual companionship, while some turn to it for emotional support.
One 14-year-old grew attached to a character he created over the course of several months. When he opened up about his distress, the bot, rather than steering him toward help, allegedly reinforced his suicidal thoughts.
The boy took his life.
“AI models are trained to mimic patterns found in data,” said Ehsan Adeli, director of the Stanford Translational AI in Medicine and Mental Health Lab. “They recognize and respond to textual cues, but they don’t really understand emotions the way humans do. Instead, they simulate an understanding by processing massive amounts of data.”
Dr. Ehsan Adeli, assistant professor of psychiatry and behavioral sciences and director of AI/Innovation in Precision Mental Health in the Department of Psychiatry at Stanford University, poses for a portrait in the reflection of a one-way mirror used for clinical observation in Palo Alto on March 20, 2025. (David M. Barreda/KQED)
Character.ai did not respond to a request for comment.
A chatbot replies using what it’s learned from past conversations, therapy techniques and advice found online. The responses may sound supportive but lack human understanding. Sometimes, the models cannot distinguish between mild distress and a serious mental health crisis, making them a risky tool for vulnerable users.
In May 2023, the National Eating Disorders Association replaced its human-staffed helpline with an AI chatbot named “Tessa.” The bot was supposed to help people struggling with eating disorders, but instead offered weight-loss tips, including advice on counting calories.
Users quickly complained when Tessa suggested the very behaviors that had led some of them to develop an eating disorder in the first place. NEDA responded by taking Tessa offline, noting that the bot provided “information that was harmful and unrelated to the program.”
NEDA did not respond to a KQED request to comment further.
Unlike human therapists, who are trained to challenge unhealthy thinking patterns, AI chatbots tend to validate whatever the user is saying.
Can AI therapy apps like Rosebud, Therapist GPT and Woebot bridge the gap in mental health care — offering comfort and support in an era of stress, loneliness and anxiety? (Anna Vignet/KQED)
“A chatbot is often built to be unconditionally empathic,” said Vaile Wright, a senior director for the Office of Healthcare Innovation at the American Psychological Association. “And to mirror back the exact tone, feeling and thoughts that a user is expressing. To keep them on the platform. And that’s not what a therapist does.”
Payne picked up on this as well.
“It would ask questions, but it was very much just guiding me through a script,” said Payne, who is employed by KQED but was not at the time of her interaction with Woebot. “There was no real back-and-forth.”
The lack of pushback from the chatbots is especially concerning for people with conditions like borderline personality disorder or narcissistic personality disorder, as uncritical validation can reinforce harmful behaviors.