
The Slippery Slope of AI and its Mental Health Applications

The current applications of AI in mental health are… troublesome

It’s no secret: conversational AI is here and in vogue. That became clear after OpenAI launched ChatGPT, which scaled to 100,000,000 monthly active users within two months, making it the fastest-growing app in history (source). Even my mother in her fifties has asked me about it. Today, though, let’s focus on how AI is being positioned to help improve mental health.

Notice anything startling in the chart above (aside from the shoddy circle I drew)? That’s right: nearly 34% of young adults experienced mental illness in 2021. Combine this with Gallup’s recent finding that 24% of 18-29-year-old respondents reported loneliness (Gallup Report), and it’s evident that mental health-focused AI products have an unfortunately large target market.

So, What Should We Be Worried About?

We should be concerned about how consumer-facing AI engages with a vulnerable segment of the population: people with mental health issues. As I see it, startups like Aiberry, Woebot Health, and Replika represent three tiers of AI use cases, each more concerning than the last.

Least Concern: Healthcare Provider Assistants

Aiberry is not positioned as a treatment for mental illness. Instead, it replaces the traditional paper-survey-based mental health screening with a chatbot and provides a pretty dashboard to the health provider. Applying AI to mental health in a scenario like this makes sense. It creates a more pleasant experience for the patient (if you’ve ever filled out a mental health form, you know it feels dehumanizing and pointless), and it seeks to improve the efficiency of an otherwise deeply inefficient system. What is concerning about this application of AI is that it targets only a small fraction of the ever-inefficient US healthcare system. Sure, faster office visits and quicker intake are good things, but what about the often prohibitively high cost and limited supply of trained therapists? I certainly do not expect improved efficiency at this level to make a dent in the overall cost of treatment.

Moderate Concern: Chatbot Therapists

Chatbot therapists, like Woebot Health, position themselves as on-demand emotional and therapeutic support. While the idea of on-demand mental health support is very enticing, I can’t help but worry about the message being sent: that you don’t need to learn how to communicate with a real person about your difficulties. Someone who absorbs that message is less likely to ever see a professional therapist. For me, building a genuine personal connection with my therapists is what unlocks the biggest insights. Without that, I’d feel like I was back to filling out a paper form and being lumped into the monolithic “person with anxiety” category.

High Concern: Virtual Avatar Therapists

Finally, we arrive at the most frightening of all: the virtual “friend” or “therapist” or whatever it’s branded as. These remind me of the movie Her, in which Joaquin Phoenix’s character befriends his AI assistant. From there, the relationship becomes increasingly intense and real, and he starts to disconnect from the physical world and his relationships. Though the movie ends on a more upbeat note, I am very skeptical that these AI-enabled products will lead to a similar outcome. It can be incredibly difficult to connect with people and feel comfortable in the world while struggling with a mental illness like anxiety. Anxiety and depression are inherently isolating, and the negative energy you carry is often expressed outward, creating a feedback loop of difficulty in social interaction. From this perspective, I do see the value that even an artificial connection can have in breaking free from an anxiety loop.

However, as with chatbot therapists, the same slippery slope toward dependence further removes an already isolated individual from society. Consider it from the business perspective: revenue and user-engagement goals are tied to creating a viral loop of engagement. The creators of an AI avatar therapist are thus incentivized to keep users coming back to the platform, which makes users less likely to move on and engage with the physical world once they’re ready.

Ideally, my prediction is wrong and AI meaningfully advances the treatment of mental illness. Unfortunately, I am not confident that AI is the solution to society’s growing problem of mental illness and isolation.

Have additional thoughts? Connect with me on Twitter and let’s chat about it! Make sure to share with your friends too if you enjoyed!