It’s hard enough to talk about your feelings to a person; Jo Aggarwal, the founder and CEO of Wysa, hopes you’ll find it easier to confide in a robot. Or, more specifically, in “emotionally intelligent” artificial intelligence.
Wysa is an AI app designed by Touchkin eServices, Aggarwal’s company, which maintains headquarters in Bangalore, Boston, and London. Wysa is a chatbot that can respond with words of affirmation or guide a user through one of 150 therapeutic techniques.
Wysa is Aggarwal’s second venture. Her first, she says, was an elder-care venture that failed to find market fit. Aggarwal fell into a deep depression, and from that experience, she says, the idea for Wysa was born in 2016.
In March, Wysa became one of 17 apps. In May, the company closed a $5.5 million funding round led by Boston’s W Health Ventures, along with the Google Assistant Investment Program, pi Ventures, and Kae Capital. Wysa has raised $9 million in total, says Aggarwal, and the company has 60 full-time employees and about three million users.
Wysa is primarily aimed at people who want to vent. Most Wysa users, Aggarwal says, are there to improve their sleep, anxiety, or relationships. The ultimate goal, she says, is not to diagnose mental health conditions.
“Out of the three million people that use Wysa, only about 10% need a medical diagnosis,” says Aggarwal. If a user’s conversations with Wysa equate to high scores on standard screening questionnaires for depression or anxiety disorders, Wysa will suggest talking to a human therapist.
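The escalation logic described here can be sketched in a few lines. This is a hypothetical illustration, not Wysa's actual code: standard screening instruments sum a user's item responses, and a score above a published cutoff is treated as a signal to suggest human help. The cutoff value and item scores below are placeholders.

```python
# Hypothetical sketch of score-based escalation (not Wysa's real logic).
# Screening questionnaires sum per-item responses (e.g. 0-3 each);
# totals above a cutoff suggest recommending a human therapist.

def needs_human_therapist(item_scores, cutoff=15):
    """Return True when the summed questionnaire score crosses the cutoff."""
    return sum(item_scores) >= cutoff

# A user answering mostly "nearly every day" (3) across nine items:
print(needs_human_therapist([3, 3, 2, 3, 2, 3, 1, 2, 3]))  # True
```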
You don’t need a clinical diagnosis to benefit from therapy. Wysa isn’t intended to be a replacement for one, says Aggarwal (whether users view it as a replacement remains to be seen), but an additional tool that users can interact with daily.
“Sixty percent of the people who come to Wysa just need to feel heard and validated, but if they’re given self-help techniques, they can work on it themselves and feel better,” Aggarwal continues. She says Wysa’s approach has been refined through conversations with users and input from therapists.
For instance, while conversing with a user, Wysa first categorizes their statements and then assigns a type of therapy based on those responses, like cognitive behavioral therapy or acceptance and commitment therapy. It then selects a line of questioning or a therapeutic technique written ahead of time by a therapist and converses with the user. Wysa, says Aggarwal, has gleaned insights from more than 100 million conversations that unfold this way.
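The routing idea above — classify a statement, map the category to a therapy style, then pick a therapist-written technique — can be sketched as a simple lookup. Every category, therapy label, and technique name here is an invented placeholder, not Wysa's real taxonomy.

```python
# Minimal sketch of category-to-technique routing (hypothetical labels).
# Therapist-authored techniques, grouped by therapy style:
TECHNIQUES = {
    "cbt": ["thought record", "behavioral activation"],
    "act": ["values clarification", "defusion exercise"],
}

# Which therapy style a classified statement maps to:
CATEGORY_TO_THERAPY = {
    "negative_self_talk": "cbt",
    "avoidance": "act",
}

def route(statement_category: str) -> str:
    """Pick a pre-written technique for the user's classified statement."""
    therapy = CATEGORY_TO_THERAPY.get(statement_category, "cbt")
    return TECHNIQUES[therapy][0]

print(route("avoidance"))  # values clarification
```

In practice the classifier would be a learned model and the technique choice would depend on conversation history; the table lookup just makes the two-step structure concrete.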
“Take, for instance, a situation where you’re angry at somebody else. Originally our therapists would come up with the technique, where you’re trying to look at it from the other person’s perspective. We found that when a person felt powerless or had trust issues, like teens and parents, the techniques the therapists gave weren’t working,” she says.
“There are 10,000 people with trust issues refusing to do the empty chair exercise. So we have to find another way of helping them. These insights have built Wysa.”
Although Wysa has been refined through this kind of iteration, formal studies of the app are still under way. Pediatricians at the University of Cincinnati helped develop a version specifically targeted toward COVID-19 anxiety. There are also ongoing studies of Wysa’s ability to help people cope with the consequences of chronic pain, arthritis, and diabetes at Washington University in St. Louis and the University of New Brunswick.
Still, Wysa has had several tests in the real world. In 2020, the government of Singapore licensed Wysa and provided the service for free to help its citizens cope with the emotional fallout of the coronavirus pandemic. Wysa is also offered through Aetna to supplement Aetna’s Employee Assistance Program.
The biggest concern about mental health apps is that they might accidentally trigger an incident or miss signs of self-harm. The UK’s National Health Service (NHS) offers specific compliance standards to address this. Wysa has met the NHS’s standards to earn that distinction.
To meet those guidelines, Wysa appointed a clinical safety officer and was required to create “escalation paths” for people who show signs of self-harm. Wysa, says Aggarwal, is also designed to flag responses suggesting self-harm, abuse, or trauma. If a user’s responses fall into those categories, Wysa will point them to a crisis line.
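An "escalation path" of this kind is, at its core, a guard that runs before the normal chatbot reply. The sketch below is a hypothetical illustration under that assumption; the category names and helpline text are placeholders, not Wysa's actual wording.

```python
# Hypothetical escalation path: high-risk categories bypass the normal
# chatbot reply and surface crisis resources instead.
RISK_CATEGORIES = {"self_harm", "abuse", "trauma"}

def respond(category: str, normal_reply: str) -> str:
    """Return a crisis-line message for flagged categories, else the reply."""
    if category in RISK_CATEGORIES:
        return "It sounds like you may need urgent support: <crisis line>"
    return normal_reply

print(respond("self_harm", "Tell me more."))
print(respond("venting", "Tell me more."))
```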
In the US, the Wysa app that anyone can download, says Aggarwal, fits the of a general wellness app or a “low-risk device.” That’s relevant because, during the pandemic, the FDA has these apps.
Still, Wysa may not perfectly categorize every response. A BBC report, for instance, noted that the app didn’t appear to appreciate the severity of a proposed underage sexual encounter. Wysa responded by updating how the app handles such mentions of sex.
Aggarwal also notes that Wysa contains a manual list of sentences, often slang, that they know the AI won’t catch or accurately categorize as harmful. Those are manually updated to ensure that Wysa responds appropriately. “Our rule is that [the response] can be 80% appropriate but 0% triggering,” she says.
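A manually maintained phrase list like the one Aggarwal describes is typically checked before (or alongside) the model, so a known-missed phrase overrides whatever label the classifier produced. The sketch below assumes that design; the phrases and labels are invented placeholders, not Wysa's real list.

```python
# Sketch of a manual override list for phrases the model is known to miss.
# Phrases here are placeholders; a real list would hold slang the team
# has observed the classifier mislabeling.
MANUAL_FLAGS = {"example slang phrase", "another coded phrase"}

def classify(message: str, model_label: str) -> str:
    """Let the hand-curated list override the model's label when it matches."""
    lowered = message.lower()
    if any(phrase in lowered for phrase in MANUAL_FLAGS):
        return "self_harm"  # manual flag beats the model's verdict
    return model_label

print(classify("she used another coded phrase today", "neutral"))  # self_harm
print(classify("what a nice day", "neutral"))  # neutral
```

The “80% appropriate but 0% triggering” rule Aggarwal cites is about the reply chosen after this step, not the classification itself.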
In the immediate future, Aggarwal plans to build Wysa into a full-stack service. Rather than referring patients who receive a diagnosis to Employee Assistance Programs (as the Aetna partnership might) or outside therapists, Wysa aims to build its own network of suppliers. On the tech side, the company is planning an expansion into Spanish and will start investigating a voice-based system based on guidance from the Google Assistant Investment Program.