ChatGPT is not your therapist

Graphic credit // Matthew Bernard

Undoubtedly, ChatGPT has become an increasingly popular tool for college students over the last couple of years. But as the generative AI platform becomes more mainstream, people have found a new, far more dangerous role for it to fill: that of a therapist. The fact remains that AI is no replacement for human connection, and treating it as one can do real damage to mental health.

If you’ve spent any time doomscrolling through TikTok these past few weeks, you’ve likely encountered the ill-fated “romance” between Kendra Hilty and her psychiatrist. In case you missed it: Hilty, a patient with ADHD, fell in love with her psychiatrist of three years, then accused him of manipulative and predatory behavior for allowing her emotional attachment to progress unchecked.

While it’s difficult to know which of these two parties was truly in the wrong, the story had one clear enemy: the AI chatbot she nicknamed “Henry” and turned to for advice.

Kendra described several instances in which she asked her AI companion for psychological support. As she sorted through her feelings about her complicated relationship with her unnamed psychiatrist, Henry offered in-depth, psychoanalytic perspectives that only reinforced the narrative she’d crafted in her head.

As media reports have noted, chatbots are built to mimic your tone and mirror your mindset. They’re built to be your friends, not your therapists.

For example, if you feed ChatGPT a novel you wrote, it’s going to tell you that it belongs with literature’s greatest classics — even if it really, really doesn’t. In the same vein, if you tell a chatbot about a difficult situation, it will find a way to paint you as the victim.

Then there’s the recent phenomenon of “AI psychosis,” a term for cases in which AI chatbots appear to trigger or exacerbate schizophrenia-like symptoms. Just last fall, a fourteen-year-old died by suicide after developing a “relationship” with an AI replica of a “Game of Thrones” character. A licensed therapist would have seen the warning signs, because a human being is capable of interpersonal connection in a way artificial intelligence never will be.

Leaders of UM’s Counseling Outreach Peer Organization, or COPE, stand at a table in the middle of UM’s Coral Gables campus during an event Wednesday, Oct. 6. Photo credit: Parker Gimbel

Trained mental health professionals are there to help you solve problems; chatbots are there to fan your ego. While we could all use positive reinforcement every now and again, being constantly placed on a pedestal isn’t sustainable. If anything, ChatGPT’s insistence on casting its user as the hero of every situation only strengthens narcissistic tendencies and superiority complexes.

ChatGPT is far too new for us to know all of its long-term effects. But even in the short term, we’ve seen the catastrophic consequences of replacing human connection with artificial intelligence. If we treat AI as a friend when it’s really a subservient robot telling us what it knows we want to hear, we’re dooming our social skills.

Therapy is expensive, and we all need someone to talk to about stressful academics and extracurriculars. But turn to the resources you already have: UM’s counseling center, advisors and professors, or your friends. Let Kendra’s story be a lesson to us all that there are some roles artificial intelligence should never fill, and the role of therapist may be at the top of the list.