Sonia’s AI chatbot steps in for therapists
Can chatbots replace human therapists? Some startups (and some patients) claim that they can. But it’s not exactly settled science.
One study found that 80% of people who’ve used OpenAI’s ChatGPT for mental health advice consider it a good alternative to regular therapy, while a separate report found that chatbots can be effective in reducing certain symptoms related to depression and anxiety. On the other hand, it’s well established that the relationship between therapist and client (the human connection, in other words) is among the best predictors of success in mental health treatment.
Three entrepreneurs, Dustin Klebe, Lukas Wolf and Chris Aeberli, are in the pro-chatbot therapy camp. Their startup, Sonia, offers an “AI therapist” that users can talk to or text via an iOS app about a range of topics.
“To some extent, building an AI therapist is like developing a drug, in the sense that we are building a new technology as opposed to repackaging an existing one,” Klebe, Sonia’s CEO, told TechCrunch in an interview.
The three met in 2018 while studying computer science at ETH Zürich and moved to the U.S. together to pursue graduate studies at MIT. Shortly after graduating, they reunited to launch a startup that could encapsulate their shared passion for scalable tech.
That startup became Sonia.
Sonia leverages a number of generative AI models to analyze what users say during “therapy sessions” in the app and respond to them. Applying techniques from cognitive behavioral therapy, the app, which charges users $20 per month or $200 per year, assigns “homework” aimed at driving home insights from conversations, along with visualizations designed to help identify top stressors.
Klebe claims that Sonia, which hasn’t received FDA approval, can tackle issues ranging from depression, stress, and anxiety to relationship problems and poor sleep. For more serious scenarios, like people contemplating violence or suicide, Sonia has “additional algorithms and models” to detect “emergency situations” and direct users to national hotlines, Klebe says.
Somewhat alarmingly, none of Sonia’s founders have backgrounds in psychology. But Klebe says that the startup consults with psychologists, recently hired a cognitive psychology graduate, and is actively recruiting a full-time clinical psychologist.
“It is important to emphasize that we don’t consider human therapists, or any companies providing physical or virtual mental health care conducted by humans, as our competition,” Klebe said. “For every response that Sonia generates, there are about seven additional language model calls happening in the background to analyze the situation from several different therapeutic perspectives in order to adjust, optimize and personalize the therapeutic approach chosen by Sonia.”
What about privacy? Can users rest assured that their data isn’t being retained in a vulnerable cloud or used to train Sonia’s models without their knowledge?
Klebe says Sonia is committed to storing only the “absolute minimum” amount of personal information needed to administer therapy: a user’s age and name. He didn’t address where, how, or for how long Sonia stores conversation data, however.
Sonia, which has around 8,000 users and $3.35 million in backing from investors including Y Combinator, Moonfire, Rebel Fund and SBXi, is in talks with unnamed mental health organizations to provide Sonia as a resource through their online portals. The reviews for Sonia on the App Store are quite positive so far, with several users noting they find it easier to speak with the chatbot about their issues than a human therapist.
But is that a good thing?
Today’s chatbot tech is limited in the quality of advice it can give, and it might not pick up on subtler signs of a problem, like a person with anorexia asking how to lose weight. (Sonia wouldn’t even know the person’s weight.)
Chatbots’ responses are also colored by biases, often the Western biases reflected in their training data. As a result, they’re more likely to miss cultural and linguistic differences in the way a person expresses mental illness, particularly if English is that person’s second language. (Sonia supports only English.)
In the worst-case scenario, chatbots go off the rails. Last year, the National Eating Disorders Association came under fire for replacing human staff with a chatbot, Tessa, which dispensed weight-loss tips that were triggering to people with eating disorders.
Klebe emphasized that Sonia isn’t trying to replace human therapists.
“We are building a solution for the millions of people who are struggling with their mental health but can’t (or don’t want to) access a human therapist,” Klebe said. “We aim to fill the gigantic gap between demand and supply.”
There’s certainly a gap, both in the ratio of professionals to patients and in the cost of treatment versus what most patients can afford. More than half of the U.S. lacks adequate geographic access to mental health care, according to a recent government report. And a recent survey found that 42% of U.S. adults with a mental health condition weren’t able to receive care because they couldn’t afford it.
A piece in Scientific American describes therapy apps that cater to the “worried well,” or people who can afford therapy and app subscriptions, rather than the isolated individuals who might be most at risk but don’t know how to seek help. At $20 per month, Sonia isn’t exactly cheap, but Klebe argues it’s cheaper than a typical therapy appointment.
“It’s a lot easier to start using Sonia than seeing a human therapist, which entails finding a therapist, being on the waitlist for four months, going there at a set time and paying $200,” he said. “Sonia has already seen more patients than a human therapist would see over the course of their entire career.”
I only hope that Sonia’s founders remain transparent about the issues the app can and cannot address as they build it out.