Expert perspectives on ChatGPT therapy: Will AI become a larger part of the mental health field?

Artificial intelligence has become a focal point of conversation in many fields, each looking to discern whether technology like chatbots or digital assistants will—or should—play a larger role. 

Now this question is being posed in the field of mental health. Though access to therapy services via ChatGPT and other chatbots could improve accessibility, artificial intelligence is not equipped to provide the connection-driven, human-centered care that therapy requires, and it likely never will be. However, that does not mean it’s useless in the sphere of mental health.

What Is ChatGPT?

ChatGPT is a machine learning model from the company OpenAI designed specifically for conversation and content creation. It is a large language model trained on a vast range of written sources, which it draws on to “chat” with people or write different types of copy with near-human skill.
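For readers curious what this kind of exchange looks like under the hood, here is a minimal sketch of a single conversational turn using OpenAI’s Python client. The model name, prompts, and setup are illustrative assumptions for this example, not details from the article.

    # Minimal sketch of one conversational exchange with an OpenAI model.
    # Assumes `pip install openai` and an OPENAI_API_KEY environment variable;
    # the model name below is illustrative.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": "You are a supportive conversation partner."},
            {"role": "user", "content": "I've been feeling overwhelmed at work lately."},
        ],
    )

    print(response.choices[0].message.content)  # the model's conversational reply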

As it is developed and fine-tuned, the chatbot is being used to write high-level copy and communications for businesses. Though some examples of ChatGPT’s writing show a fair level of competence, there are concerns about quality and customer satisfaction if it is used for professional writing or communication at scale in place of human professionals.

Can ChatGPT Provide Therapy?

Technically, anyone can provide therapy, including chatbots. However, not all therapy is professional, competent, or, most importantly, effective for the client. The central goal of therapy is to help people manage their unique mental health challenges and ultimately learn to live happier, more fulfilling lives.

If something about the therapy someone is receiving isn’t working for them, then neither the services nor the client are reaching their full potential. In this sense, it would be important to assess the quality of ChatGPT’s therapy services and weigh the format against individual clients’ needs.

It’s also important to note that ChatGPT could never fully take over psychiatric services (therapy’s medical counterpart), as prescribing medication requires the nuance of in-person evaluation.

What Are Examples of Mental Health Chatbots?

A few apps currently available, such as Wysa and Limbic, offer users the ability to have conversations led by AI chatbots, though most are meant to be used in conjunction with talk therapy from a qualified counselor. Other apps, like Replika, offer something closer to a companion: a bot that can chat with users when they “need an empathetic friend.”

Are Chatbots Good for Mental Health?

The effects of chatbots on mental health depend on what they are used for; however, there are certain instances where they can have a positive impact. Hampton Lintorn-Catlin, Chief Technology Officer at Thriveworks, suggests that “while the specific answers might not be trustworthy, ChatGPT can be very helpful as a sounding board.” Lintorn-Catlin has over 17 years of tech industry experience and is best known for creating Sass, the world’s most popular CSS extension language, adopted by Google, Facebook, Apple, and Microsoft, among others.

“As long as you are looking for answers from yourself, not your AI conversation partner, then it can be helpful. However, going to AI for advice or specific answers is a dangerous proposition as it has no real understanding of the world around it,” Lintorn-Catlin states.

However, if a chatbot is used to provide actual therapeutic services, there is less of a guarantee that the outcome will be helpful or “good” for one’s mental health. When a chatbot gives mental health advice to someone without training in that area, it can be difficult for them to know whether the answers they’re receiving are accurate. Seeking a diagnosis online without speaking to a qualified mental health professional is also risky, since many mental health conditions share presentations and symptoms.

Ultimately, AI can provide commonly given advice that it’s seen across all types of platforms—but without a human ability to ask the right questions, distinguish one set of symptoms from another based on how the person feels, and understand what they mean when they describe their symptoms, there is still a missing component. Because of this, chatbots cannot provide comprehensive mental health services.

How Do Counselors Use ChatGPT?

Chatbots like ChatGPT can help counselors with their practice to a degree. Whether clinicians want new ideas for their work or simply need to put an issue into words to help straighten it out, the chatbot can provide answers based on its collected information, which the clinician can then adapt or reject according to their expert knowledge.

What Are the Advantages of Mental Health Chatbots?

As stated above, mental health bots could be helpful to clinicians (and others) looking for clarity when sorting through their thoughts and ideas. “If you are dealing with a personal issue or a problem at work, sometimes just the act of having a conversation with someone, even an untrusted person, can help you think through your ideas and help clarify your own thoughts on a subject,” explains Lintorn-Catlin.

The distinction between asking bots like ChatGPT questions and relying on their answers is important, though. An experienced clinician or mental health professional might review what ChatGPT has to say about a certain query and assess the accuracy and applicability of its statements, effectively fact-checking the information. For untrained patients who are looking for real answers, however, chatbots could easily misdiagnose or misadvise.

Though chatbots can theoretically provide practical advice based on the facts presented and the information available to them, human clinicians have the ability to infer and to provide a level of human insight that is often necessary for effective therapeutic services.

What Are the Disadvantages of Mental Health Chatbots?

Though there’s the obvious question of accuracy when considering a chatbot’s ability to advise and diagnose, there is another glaring issue when it comes to using AI for psychotherapy: client comfort. This concern comes in two parts.

Therapy Format

First, most AI platforms rely on text as the medium of communication, which does not work well for many individuals seeking mental health support. So, even if a chatbot could provide accurate and helpful advice, many people don’t thrive in therapy that doesn’t involve physically speaking to someone, AI or not.

Non-verbal cues and mental status observations are also very important in therapy. A text-based chatbot cannot register pauses or see the look on someone’s face, cues that therapists are trained to read, understand, and respond to, which brings us to the next point.

Ability to Connect and Foster Comfort

A key part of therapy, and the second part of this concern, is the human-to-human connection. In order for therapy to be effective and do its work, clients need to be comfortable being honest and opening up about themselves, likely more than they’ve ever been with another person. It can be disconcerting to know that the other “person” talking or texting with you isn’t real, making it hard to reach the level of honesty necessary for real change.

Honesty is one of the most important factors in therapeutic treatment. If a client is uncomfortable being honest in their sessions, it could stall their growth or prevent them from reaching their fullest potential in treatment; research shows that better relationships between clinicians and patients lead to better outcomes. In addition, a client’s emotional growth could be warped if sessions begin working from misguided advice.

ChatGPT has no real concept of what a client is going through or how to help them. It has been trained to produce certain responses to certain inputs, and, unfortunately, therapy is not that simple. In therapy, giving a client the “right answer” is not always the right thing to do. Clients may not always be ready to hear flat-out truths, and though AI might be able to find that right answer, it does not necessarily have the insight, and certainly not the empathy, to understand the balance between truth and growth.

“The best mental health care comes when real human beings take the time to connect and one of them is specially trained to help the other person,” says Lintorn-Catlin. At Thriveworks, we are focused on using tech to support our clinicians in their effort to treat our clients. The purpose of our technology is to assist with delivery of care, not to be the care itself.


Will ChatGPT Replace Psychologists?

Likely not, much as remote or online therapy won’t fully replace in-person services. Put simply, not everyone will be able to text or talk about their mental health struggles with someone who isn’t real. Therapy relies so heavily on connection, trust, and instinct that it is very hard to replicate using information drawn from other, similar scenarios.

Psychology, though, may be a somewhat easier field in which to apply AI. Since psychologists offer several forms of psychological evaluation, which tend to be more objective in nature than talk therapy, AI may prove a more reliable aid in evaluation, much as it has in the medical model.

In the end, though, AI like ChatGPT will need more fine-tuning before it can provide mental health services on the level of today’s human experts, and it will likely never eliminate the need for human mental health professionals in the therapeutic sphere.


As Chief Counseling Officer, Ryan Culkin oversees the logistics and operations of all company clinics, telehealth, and clinicians nationwide. He brings over a decade of clinical experience and a unique business perspective to Thriveworks.

Dr. Scott Gordon

Dr. Scott Gordon is Thriveworks’ Chief Medical Officer. He ensures that the mental health services provided are both safe and effective. In addition, Dr. Gordon provides strong, inspiring medical leadership to the health care professionals at Thriveworks and oversees daily operations.

Hannah DeWitt, Mental Health Writer

Discover Hannah DeWitt’s background and expertise, and explore the expert articles she has written or contributed to on mental health and well-being.

We only use authoritative, trusted, and current sources in our articles. Read our editorial policy to learn more about our efforts to deliver factual, trustworthy information.

  • DeAngelis, T. (2019, November 1). Better relationships with patients lead to better outcomes. Monitor on Psychology. https://www.apa.org/monitor/2019/11/ce-corner-relationships

  • Bogost, I. (2022, December 7). ChatGPT is dumber than you think. The Atlantic. https://www.theatlantic.com/technology/archive/2022/12/chatgpt-openai-artificial-intelligence-writing-ethics/672386/

Disclaimer

The information on this page is not intended to replace assistance, diagnosis, or treatment from a clinical or medical professional. Readers are urged to seek professional help if they are struggling with a mental health condition or another health concern.

If you’re in a crisis, do not use this site. Please call the Suicide & Crisis Lifeline at 988 or use these resources to get immediate help.
