Mental health app Wysa raises $5.5M for “emotionally intelligent” AI

by Joseph K. Clark

It’s hard enough to talk about your feelings to a person; Jo Aggarwal, the founder and CEO of Wysa, hopes you’ll find it easier to confide in a robot. Or, more specifically, in “emotionally intelligent” artificial intelligence.

Wysa is an AI-powered mental health app designed by Touchkin eServices, Aggarwal’s company, which maintains headquarters in Bangalore, Boston, and London. Wysa is a chatbot that can respond with words of affirmation or guide a user through one of 150 therapeutic techniques.

Wysa is Aggarwal’s second venture. Her first, she says, was an elder care company that failed to find product-market fit. Aggarwal fell into a deep depression, and out of that experience, she says, the idea for Wysa was born in 2016.

In March, Wysa became one of 17 apps in the Google Assistant Investment Program. In May, it closed a $5.5 million Series A funding round led by Boston’s W Health Ventures, the Google Assistant Investment Program, pi Ventures, and Kae Capital. Wysa has raised a total of $9 million in funding, says Aggarwal, and the company has 60 full-time employees and about three million users.

Wysa is primarily aimed at people who just want to vent. Aggarwal says most Wysa users are there to improve their sleep, anxiety, or relationships. The ultimate goal, she says, is not to diagnose mental health conditions.


“Out of the 3 million people that use Wysa, only about 10% need a medical diagnosis,” says Aggarwal. If a user’s conversations with Wysa correspond to high scores on traditional screening questionnaires like the PHQ-9 for depression or the GAD-7 for anxiety, Wysa will suggest talking to a human therapist.
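A minimal sketch of that handoff, assuming Wysa maps conversations onto questionnaire-equivalent scores (how it actually derives them is not public). The PHQ-9 is scored 0–27 and the GAD-7 0–21; the cutoffs below are the standard published severity thresholds, and the function name is hypothetical:

```python
# Rough sketch of the screening handoff described above, not Wysa's code.
# PHQ-9 ranges 0-27, GAD-7 ranges 0-21; these are the published cutoffs for
# "moderately severe" depression and "severe" anxiety, respectively.

PHQ9_HIGH = 15
GAD7_HIGH = 15

def next_step(phq9_score: int, gad7_score: int) -> str:
    """Suggest escalation when conversation-derived scores run high."""
    if phq9_score >= PHQ9_HIGH or gad7_score >= GAD7_HIGH:
        return "suggest talking to a human therapist"
    return "continue with in-app self-help techniques"
```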

You don’t need a clinical mental health diagnosis to benefit from therapy. Wysa isn’t intended to be a replacement for therapy, says Aggarwal (whether users view it as a replacement remains to be seen), but an additional tool that users can interact with daily.

“60 percent of the people who come and talk to Wysa need to feel heard and validated, but if they’re given self-help techniques, they can work on it themselves and feel better,” Aggarwal continues. She says Wysa’s approach has been refined through conversations with users and input from therapists.

For instance, while conversing with a user, Wysa will first categorize their statements and then assign a type of therapy based on those responses, such as cognitive behavioral therapy or acceptance and commitment therapy. It then selects a line of questioning or a therapeutic technique written ahead of time by a therapist and converses with the user. Wysa, says Aggarwal, has gleaned insights from over 100 million conversations that have unfolded this way.
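As a sketch, that flow might look like the following. Everything here — the categories, the category-to-therapy mapping, and the script bank — is hypothetical; only the overall shape (categorize, assign a therapy, pick a therapist-written technique) comes from Aggarwal’s description:

```python
# Hypothetical categorize -> assign-therapy -> scripted-technique flow.
# The classifier, categories, and scripts are all placeholders.

SCRIPT_BANK = {
    "cbt": "therapist-written thought-record questions",
    "act": "therapist-written acceptance exercise",
}

THERAPY_FOR_CATEGORY = {
    "negative-self-talk": "cbt",  # cognitive behavioral therapy
    "avoidance": "act",           # acceptance and commitment therapy
}

def categorize(statement: str) -> str:
    """Stand-in for Wysa's classifier, which is not public."""
    return "negative-self-talk" if "never" in statement.lower() else "avoidance"

def respond(statement: str) -> str:
    therapy = THERAPY_FOR_CATEGORY[categorize(statement)]
    return SCRIPT_BANK[therapy]  # converse using the prewritten technique

print(respond("I never get anything right"))  # -> the CBT script
```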

“Take, for instance, a situation where you’re angry at somebody else. Originally our therapists would come up with the empty chair technique, where you’re trying to look at it from the other person’s perspective. We found that when a person felt powerless or had trust issues, like teens and parents, the techniques the therapists gave weren’t working,” she says. 

“There are 10,000 people facing trust issues refusing to do the empty chair exercise. So we have to find another way of helping them. These insights have built Wysa.”

User feedback isn’t the only input: research institutions have also played a role in Wysa’s ongoing development. Pediatricians at the University of Cincinnati helped develop a module specifically targeted toward COVID-19 anxiety. There are also ongoing studies at Washington University in St. Louis and the University of New Brunswick of Wysa’s ability to help people cope with the mental health consequences of chronic pain, arthritis, and diabetes.

Wysa has also had several tests in the real world. In 2020, the government of Singapore licensed Wysa and provided the service for free to help citizens cope with the emotional fallout of the coronavirus pandemic. Wysa is also offered through the health insurance company Aetna as a supplement to Aetna’s Employee Assistance Program.

The biggest concern about mental health apps is that they might accidentally trigger an incident or miss signs of self-harm. The UK’s National Health Service (NHS) offers specific compliance standards to address this. Wysa complies with the NHS’s DCB0129 standard for clinical safety; it is the first AI-based mental health app to earn the distinction.

To meet those guidelines, Wysa appointed a clinical safety officer and was required to create “escalation paths” for people who show signs of self-harm. Wysa, says Aggarwal, is also designed to flag responses that indicate self-harm, abuse, suicidal thoughts, or trauma. If a user’s responses fall into those categories, Wysa will prompt them to call a crisis line.
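A minimal sketch of what such an escalation path could look like; the flag names, helper function, and crisis-line message are assumptions for illustration, not Wysa’s implementation:

```python
# Illustrative escalation path: flagged high-risk messages bypass the normal
# chatbot flow and surface a crisis line. All names here are hypothetical.

HIGH_RISK_FLAGS = {"self_harm", "abuse", "suicidal_thoughts", "trauma"}

def normal_reply(message: str) -> str:
    return "placeholder for the usual chatbot response"

def handle(message: str, flags: set) -> str:
    if flags & HIGH_RISK_FLAGS:
        # the escalation path required by DCB0129-style clinical safety review
        return "You deserve support from a person. Please consider calling a crisis line."
    return normal_reply(message)
```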

In the US, the version of Wysa that anyone can download fits the FDA’s definition of a general wellness app or “low-risk device,” says Aggarwal. That’s relevant because, during the pandemic, the FDA created guidance to accelerate the distribution of such apps.

Still, Wysa may not perfectly categorize every user’s responses. A 2018 BBC investigation, for instance, noted that the app didn’t appear to appreciate the severity of a proposed underage sexual encounter. Wysa responded by updating the app to better handle conversations about coercive sex.

Aggarwal also notes that Wysa contains a manually curated list of sentences, often slang, that the team knows the AI won’t catch or accurately categorize as harmful. Those are updated by hand to ensure that Wysa responds appropriately. “Our rule is that [the response] can be 80% appropriate but 0% triggering,” she says.
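Conceptually, that manual safety net is a hand-maintained lookup that runs before the model, as in this sketch (the phrases and the model stub are placeholders):

```python
# Sketch of a hand-maintained override list checked before the AI classifier.
# The phrases are placeholders; Wysa's real list and model are not public.

MANUAL_HARM_PHRASES = {
    "example slang phrase a",
    "example slang phrase b",
}

def model_predicts_harm(text: str) -> bool:
    return False  # stand-in for the AI classifier

def is_harmful(message: str) -> bool:
    text = message.lower()
    # the manual list wins: "80% appropriate but 0% triggering"
    if any(phrase in text for phrase in MANUAL_HARM_PHRASES):
        return True
    return model_predicts_harm(text)
```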

In the immediate future, Aggarwal says, the goal is to become a full-stack service. Rather than referring patients who receive a diagnosis to Employee Assistance Programs (as the Aetna partnership might) or outside therapists, Wysa aims to build its own network of mental health providers. On the tech side, the company is planning an expansion into Spanish and will start investigating a voice-based system, based on guidance from the Google Assistant Investment Program.
