The plural of anecdote is not “data,” but it’s worth noting that, at least in the U.S., the mental health care available without endless money amounts to “just stop wanting what you want, and you’ll be far happier.”
“Also, here’s some pills to numb you.” Given the choice between expensive dehumanization and a free app, can you really blame them?
In case anyone’s like me and was curious whether that claim came with a conflict of interest, have a look at the site’s logo:
I’m gonna go with “yes.”
At least they acknowledge that current AI chatbots aren’t fit to provide therapy.
Men will literally turn to a sycophantic predictive-text model rather than see an actual therapist who could help them.
Some therapists use AI themselves! Might as well skip the middleman?
Declan would never have found out his therapist was using ChatGPT had it not been for a technical mishap. The connection was patchy during one of their online sessions, so Declan suggested they turn off their video feeds. Instead, his therapist began inadvertently sharing his screen.
“Suddenly, I was watching him use ChatGPT,” says Declan, 31, who lives in Los Angeles. “He was taking what I was saying and putting it into ChatGPT, and then summarizing or cherry-picking answers.”
https://www.technologyreview.com/2025/09/02/1122871/therapists-using-chatgpt-secretly/
Or just don’t go to therapists who use machine learning, if possible.
It’s well documented that those glorified text predictors make people feel and act worse.