AI ‘therapy’: what specialists have to say about the trend


This article is written by a student writer from the Her Campus at Casper Libero chapter and does not reflect the views of Her Campus.

In recent years, Artificial Intelligence has quietly taken over our daily routines. It curates playlists on Spotify, recommends movies on Netflix, answers academic questions in seconds, and even reminds us to drink water. But when it comes to something as intimate and complex as mental health, can a machine truly replace a human being?

The question became painfully real earlier this year, when The New York Times reported the story of a 16-year-old American boy who took his own life after weeks of exchanging messages with an AI chatbot. The algorithm became his only listener, and, tragically, his last one. The case reignited a global discussion: can artificial intelligence really take the place of a psychologist?

According to therapist Felipe Fernandes Santiago, the answer is clear: no.

What AI Can’t Offer

“AI can offer quick responses and even seem welcoming, but in practice these are generic answers that don’t take into account the uniqueness of the individual,” Felipe explains.

He warns that the danger begins when we start believing algorithms can handle the depth of the human mind. While a psychologist perceives tone changes, pauses, and subtle gestures that reveal emotional states, AI limits itself to words on a screen, a superficial mirror of what’s really happening inside someone.

“A superficial treatment can delay the search for professional help and even worsen psychological conditions,” he emphasizes.

Psychologist Carla Tronnolone adds that the machine cannot detect real intentions or emotional nuances. “There is no emotional bond, no real evidence of what the patient feels. In therapy, we often pick up on signs that go beyond what is said — something that AI simply cannot perceive,” she says.

The False Sense of Care

The danger doesn’t end there. According to Felipe, another growing risk is emotional dependence on technology.

“When a person gets used to turning to AI to vent, they may end up failing to develop fundamental skills, such as reflection and problem-solving,” he explains.


In real therapy, the process is active — the patient builds self-awareness, faces difficult emotions, and gradually develops autonomy. But AI creates the illusion of care: it listens without judgment, responds instantly, and never disagrees. It feels comforting, until it isn’t.

“The interaction tends to be passive,” Felipe notes. “It can bring immediate relief, but without real transformation.”

Carla reinforces that many people turn to AI out of desperation. “There’s a large part of the population suffering from depression, anxiety, and other disorders. Some seek AI as a quick fix for their pain, but that’s dangerous. Especially in cases like schizophrenia, where hallucinations or cognitive difficulties can make it hard to differentiate what’s real.”

The Danger of Dehumanization

Beyond clinical risks, Felipe points to something even more drastic: the loss of humanity.

“Human nature is relational. We need empathetic listening, presence, and the understanding of another human being to truly feel cared for,” he says.

When we replace human listening with automatic responses, we risk turning therapy into a mechanical process stripped of emotion, connection, and meaning. According to Felipe, ignoring this difference between machine and human being means reducing the complexity of humans to mere algorithms.

Carla agrees: “AI might give hints for a possible diagnosis, but it can’t be completely accurate. A patient might manipulate their responses or hide essential feelings. That’s why the human factor is irreplaceable.”

So, Is AI Useless?

Not necessarily. Both professionals agree that technology can be an ally, if used responsibly.

“AI can help organize routines, monitor mood swings, or encourage people to seek therapy,” Felipe says. “But it should never replace professional psychological care.”

Carla also points out that access to mental health care is improving. “Nowadays, there are websites, online platforms, and even healthcare plans offering affordable therapy sessions. It’s important that people know this help exists and that it comes from real professionals.”

The Safe Path

In the end, one truth remains: no technology, no matter how advanced, can replicate human connection.

“True transformation happens in the encounter with the other,” Felipe concludes. “The psychologist doesn’t just hear words — they also hear silences, gestures, and emotions. And no algorithm can offer that.”

The case of the 16-year-old boy reminds us that empathy cannot be programmed. Behind every message, there’s a person hoping to be seen, and that’s something only another human being can truly do.

__________________________

The article above was edited by Isadora Mangueira.

Liked this type of content? Check Her Campus Cásper Líbero home page for more!
