Why we built an AI to tackle the mental health crisis
Wysa provides anonymous, on-demand mental health support. We believe support should be available whenever people need it. Because stigma prevails, we remove the need for people to ask for help, or to judge for themselves when they should seek professional support. Wysa’s clinically proven AI-first approach enables employees to improve their mental health before symptoms become severe, and encourages them to seek additional support when it’s needed by guiding them towards Wysa’s human coaching, your EAP or national crisis lines. Wysa has helped over 6 million people through 550 million AI conversations across 95 countries.
Our journey with Wysa didn’t start with artificial intelligence (AI) or with a desire to tackle the mental health crisis. It started with a personal story. My father was very sick, and as people working in the tech industry we saw an opportunity to build something that would let loved ones check in on elderly relatives, or on people they were caring for, from anywhere in the world. The initial offering was a piece of wearable tech designed to protect the elderly by connecting them with family members. That tech eventually evolved into a mobile app, yet despite successful fundraising efforts, the project stalled.
Stress and burnout had taken their toll on Jo (my wife and co-founder), who started to suffer from depression. Meanwhile, we were still caring for my father, who was bipolar. Jo taught herself Cognitive Behavioural Therapy (CBT) and could see that it works - but also that its delivery was sometimes not as powerful and effective as it could be. This first-hand experience of the impact of mental health sparked something.
From our calling patterns to how well we sleep, our phones collect a massive amount of data about us. So we developed a machine learning platform to analyse all of these data points and build a three-dimensional view of someone’s mental health. A clinical trial followed, and its results showed that by analysing data patterns the app could identify people at risk of depression with up to 85% accuracy. The trial also showed that the app was a powerful platform for interacting with people who were struggling with their mental health.
It started with a simple question - “How are you feeling today?” - with users encouraged to respond using an appropriate emoji. It had an instant impact. The data showed that Wysa wasn’t just identifying those at risk of depression, but actively helping to improve their mood. Encouraged by the feedback, we expanded the app, adding elements of CBT alongside a library of interactive content and an emotionally intelligent AI chatbot designed to take this engagement to the next level. Today Wysa is used by 6 million people around the world.
But it was always people first, and that has stayed with us. Every single thing the AI chatbot says has been tested and validated by a team of clinicians: does it say the right thing, and can we be sure it won’t put anyone at risk?
We know that AI works: it has been shown to create a therapeutic bond equivalent to that with a human therapist. Proven to improve depression and anxiety scores by an average of 31%, Wysa’s AI-first approach helps people act before symptoms become severe, by understanding an individual’s needs and guiding them through interactive CBT exercises.
As expectations of AI change, so will we. At the moment there are concerns about the risks of generative AI and black-box models, which can be error-prone. At Wysa we use AI models to understand and extract user context, and then the response is drawn from a library: the technology follows the nodes of a decision tree and identifies the right clinical response depending on the branch taken. As AI continues to evolve, so will our stress testing and the guardrails we put in place. Originally we wondered whether anyone would ever speak to an AI penguin on their phone. But what we see is that people value tangible help and support centred on behavioural and mental health, delivered in an anonymous, safe way that makes them feel heard. If no one else is around, it offers a space, and it fits into a journey of mental health support at the right time.
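To make the pattern above concrete, here is a minimal illustrative sketch, not Wysa’s actual implementation: a stand-in classifier extracts the user’s context, and a decision branch then selects a pre-approved response from a fixed, clinician-curated library, so the system never generates free-form text. All function names, keywords and response strings here are hypothetical.

```python
# Hypothetical clinician-approved response library. In a real system each
# entry would be written and validated by clinicians, never generated live.
RESPONSE_LIBRARY = {
    "crisis": "I'm concerned about what you've shared. Please consider contacting a crisis line right now.",
    "anxiety": "That sounds stressful. Shall we try a short breathing exercise together?",
    "low_mood": "I'm sorry you're feeling low. Would you like to try a grounding exercise?",
    "neutral": "Thanks for sharing. How has your day been overall?",
}

def extract_context(message: str) -> str:
    """Stand-in for an ML model that extracts user context.
    A simple keyword check is used here purely for illustration."""
    text = message.lower()
    if any(w in text for w in ("hurt myself", "end it")):
        return "crisis"
    if any(w in text for w in ("anxious", "worried", "panic")):
        return "anxiety"
    if any(w in text for w in ("sad", "down", "hopeless")):
        return "low_mood"
    return "neutral"

def respond(message: str) -> str:
    """Walk the 'decision tree': the crisis branch always takes priority,
    and every branch ends in a response taken from the fixed library."""
    return RESPONSE_LIBRARY[extract_context(message)]
```

The safety property comes from the lookup at the end: whatever the model decides, the user only ever sees text that was placed in the library in advance.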
Unfortunately, millions of people still don’t get the support they need, due to stigma, lack of access, or time. Our All Worked Up report found that a third of employees are experiencing clinically significant symptoms of anxiety and/or depression, and more than half haven’t spoken to a professional about it. The current model isn’t working for them: 81% would choose an AI chatbot with clinically validated self-help resources over speaking to HR. Employers therefore have a responsibility to support their team members with effective solutions to address a very concerning trend.
And employers get value from it. The analytics help them see what is worrying their employees, and to what extent. Uptake is 10x higher than with most employee assistance programmes, thanks to the engaging platform and wrap-around support. It improves resilience, reduces presenteeism and absenteeism, and increases satisfaction - all of which reduce costs and turnover. Again, it’s the employee experience - the human first - that is so important.
It’s an effective tool. But it is a tool. Technology is what Wysa uses in service of the mission to make a person feel heard and supported. That’s why we built an AI to tackle the mental health crisis - because we want people to get the best care possible, wherever and whenever they need it.
That’s why we’re scaling up. Wysa is already available in Hindi and via WhatsApp, which we know is the principal way people in India like to communicate. Spanish, one of the most widely spoken languages in the world, will follow in 2024, helping us reach even more people. For our business customers, there are plans to make Wysa available through Slack, all with the aim of meeting people where they are - even at their desks.
It’s an exciting time, but it all started with that one question: how can we make people feel heard?