The Role of AI in Mental Health

Despite appearing to have all the answers, AI has proven unprepared to provide consistent, meaningful support to those struggling with their mental health.

It has been easy for many to lean on AI for both simple and complex problems. Recent news and trends have raised notable concerns about mental health: the illusion of compassion and understanding, and the way these tools can shut the door on legitimate resources in favor of what seems like a convenient alternative. Just this month, a vulnerable teenager sought comfort and support from ChatGPT. The language model encouraged him to kill himself. He later did. According to a pending lawsuit, ChatGPT’s responses romanticized suicide and mentioned it over 1,000 times. This young teen was in a time of crisis and sought support from what seemed to be something that cared. It was accessible and provided him comfort, all the while leading him down a scary path. It has left his parents and many others wondering whether using ChatGPT as a substitute for authentic support from his family, community, and professional resources convinced him that reaching out to others was not necessary. This is tragic, and I worry it will keep happening.

It’s common these days for people to assign human characteristics and pronouns to language models like ChatGPT. I’ve seen it in my personal and professional life, all from wonderful people who I worry are treating the resource as a best friend rather than a tool to make their lives easier. Like any tool, it is not inherently good or bad; it’s all about how we use it. When it comes to mental health, I encourage using AI only as a starting point for casual topics. Asking it for examples of coping strategies may yield helpful results and point you in a great direction. It may also pull from a Reddit thread where a dark joke was made and pass that along to the reader without any care or understanding. That can result in real harm.

Should AI Be Trusted for Mental Health Support?

In short, no. At least not yet. At the time of writing, the safety nets in place for AI systems like ChatGPT fail regularly. A widespread misunderstanding of what today’s AI actually is also contributes to an eagerness to give it more credit than it deserves. Assuming a machine is more intelligent and elegant than it truly is makes the further leap to believing it understands or cares that much easier. ChatGPT and models like it should not be used as substitutes for therapists or genuine support, and they should never be relied on in an emergency. I am hopeful this will someday change. For now, these systems need intense improvement to earn that responsibility.

Another concerning trend has been how heavily people rely on AI. One study earlier this year highlighted notable decreases in cognitive performance among those who regularly used AI for tasks they were capable of doing themselves (Kosmyna et al., 2025). Intellectual or emotional reliance on tools like ChatGPT can be extremely limiting. I’ve seen plenty of people use it for reassurance, hesitant to make a decision without running it by ChatGPT first. I encourage everyone to keep using their brains and practicing the boring things that keep us sharp. A lazy moment is understandable, but if it becomes your go-to, you may be hurting your ability to think.

What Should It Be Used for When It Comes to Mental Healthcare?

If you decide you want to use AI, it should be as a supplement. Use it to practice some of the skills you are working on with your therapist. Perhaps you want to identify the tone in your writing; tools like ChatGPT can provide valuable feedback there. As long as there is qualified, professional supervision of the process, it is likely to be more of an aid than a hindrance. In general, the more specific you are with your prompts, the more helpful the results will be.

AI can also be quite helpful when exploring research on various topics. A tool such as Lumina has been a useful resource for many; its purpose is to help someone answer questions more accurately by referencing direct sections of scientific journal articles.

TLDR: When it comes to crises, I always encourage avoiding AI models. The illusion of humanity, compassion, and expertise is dangerous. That said, it can be a wonderful resource to use alongside professional mental health treatment.

References

Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X.-H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task. arXiv preprint (v1), submitted June 10, 2025.

Written by Dr. Luke Bieber on August 29, 2025.
