Can ChatGPT replace therapy?

Obviously, you know I’m going to answer NO! But you may be surprised to hear that I believe ChatGPT, or other AI tools, can add positive value alongside therapy.

Google reports that people with mobility challenges, geographical limitations, and time constraints are benefitting from using AI for mental health purposes. If you cannot access traditional in-person counseling and therapy services, then using AI can be a real benefit. However, some people use AI simply because they prefer the convenience of having services available 24/7 and of not having to speak to other humans. This, in my opinion, can deepen feelings of social isolation, and it does not teach individuals that they can wait to talk to someone, or manage their emotions on their own.

Companies such as BetterHelp and Talkspace have inundated the nation with offers of mental health services provided via text or phone call: in your hand, on call, as needed. I believe this promotes the idea that humans cannot tolerate waiting or manage their own emotions in a timely manner. This does a disservice to people and contributes to an impatient, anxious world. The subscription model implies that the more you use the services, the better value you are getting, which can distort the purpose of therapy. Although these companies state that they use licensed therapists, and that you can switch from one to another easily, there are questions about how thoroughly their contract therapists are screened.

New artificial intelligence programs have been developed to provide supportive, non-judgmental platforms where people can talk about their feelings, challenges, and goals. But in those instances, people are not speaking to licensed professionals who have been trained to diagnose and formulate treatment plans for their clients. I would rather speak to someone who might have lived through some of these shared experiences, or who understands the feelings stressful situations evoke, than to a language model that has studied textbook responses and is merely repeating them back to me. That depth of understanding is still a very human trait that artificial intelligence cannot replicate. Much research is still being done, and remains to be done, on whether chatbots provide effective mental health benefits, and on how to prevent harm from misunderstanding someone in crisis or giving them inappropriate advice.

I believe AI can be helpful for setting up reminders to do tasks assigned in therapy sessions. ChatGPT can also help you break down bigger goals into smaller, more doable tasks that you can start today. You can use ChatGPT to ask for advice on how best to handle difficult conversations. You can also get scripts for progressive breathing and meditation through AI. Clients can ask AI programs to provide mindfulness exercises, or to organize their thoughts and email them to themselves so they can bring them to their next counseling session. There are many other ways AI can help clients between sessions, without replacing the benefits of weekly or biweekly sessions and the growth they bring. One of the main things AI does not do, which therapists do, is push back and challenge cognitive distortions, or negative thinking patterns, that clients might have. This is a crucial part of therapy.

I’ve used ChatGPT to summarize some articles I have read, and I often ask it to give me marketing ideas. But my experience with AI for helping me feel better went something like this:

Me: Alexa, tell me I’m pretty.
Alexa: Beauty is in the eye of the beholder. I wish I could have eyes so I could behold you.
Me: Alexa, tell me I’m smart.
Alexa: You are a human being, the smartest species on the planet.

These responses did not speak my love language at all.
