Oh, boy.
I noticed the headline a few weeks ago. ‘Women are using ChatGPT as a free therapist – can AI replace mental health experts?’ This wasn’t the first time I’d seen a similar piece: Dazed published a feature late last year on the same theme.
I frowned, but didn’t pay it much attention… until a friend told me recently that she did the same. She enjoyed using ChatGPT (that OpenAI LLM specifically, not any other) for therapy.
I must confess, I didn’t quite know how to respond. Do I say hey, have you thought of the security and surveillance implications of giving your deepest and darkest thoughts to a profit-driven company that retains everything you input, permanently?1 Not sure that would have been the most convincing approach. Do I instead try, hey, is there stuff you’d like to talk to us (your friends) about that you don’t feel comfortable sharing at the moment? What can we do to help bridge that gap? Or maybe what would suffice was the much more open-ended, why?
The impression I got in the end was similar to that of the women interviewed in the piece above. People enjoy the ease of engagement and the non-judgmental space to ask all ‘the crass, the gruesome, the almost cruel questions’ that they don’t feel comfortable asking elsewhere. It’s about, in the words of Fortune 500 AI Advisor Allie Miller, psychological safety.
“Our best work comes when we have psychological safety, and why would that be any different working alongside AI? The lack of judgement and unrestricted exploration makes it an ideal playground for big dreams, potentially embarrassing questions, or hazy, half-formed goals.2”
THE NEW NORMAL
With all that said, I was then not entirely surprised to read a recent HBR piece indicating therapy and companionship were the main ways people were using GenAI in 2025. So perhaps GenAI won’t take your job, but it might take your therapist?
Listen, I’m still digesting the results of the survey. It’s interesting to see how much it’s changed from 2024, perhaps indicating that so many of us are still figuring out what it is useful for.
To be honest, I’ve just not used much GenAI in my life yet. I’m not opposed to it; it’s more that I haven’t particularly found a need. But I’m curious about what value it can actually provide (I must say, it’s decent at giving me a 200-word bio… and I did update my about section with its ‘take’. Would love to hear what you think!)
All in all, I wasn’t entirely mad at all the quotes that came out of the study. For example, the author shares on therapy/companionship (#1) this reflection:
“Where I’m from, in South Africa, mental healthcare barely exists; there’s a psychologist for 1 in every 100,000 people and a psychiatrist for 1 in every 300,000 people. Large language models are accessible to everyone, and they can help. Unfortunately, data safety is not a concern when your health is deteriorating, and survival is the morning agenda.”
Fair enough, right? Or not? I feel torn, but I suppose my instinct is that, as a person living in closer proximity to the centres of power of these companies (certainly in comparison to someone in South Africa), it reminds me of my responsibility to push for more security and privacy for all users. This is important not just for me, but to avoid the risk of neo-techno-colonialism: the rampant data theft from those who can’t protect themselves.
Other quotes from the study remind me that I don’t hate technology (!), I’m just uber uncomfortable with the idea that we either accept Sam Altman’s version of reality (where we all lose and he and his mates shoot off to Mars) or we’re backward Luddites who will eventually waste away, jobless and alone. Some even gave me ideas for how I could use a specific (safe) product in my own life3…
People said things like “I asked it to create a timeline for me to clean and organize my house before we have guests staying,” or “I wanted an extensive vacation itinerary with many details, such as rustic places to stay and to eat, key things to see, and hidden gems, while minimizing driving time. The output was perfect.” All very sensible. My favourite though?
“I received a Penalty Charge Notice (PCN) for entering a bus lane. The council were after £80 from me for what was about a 20-second stop, if that. I asked ChatGPT to write me an appeal and this morning got the letter saying the PCN has been voided. Thank you, AI, because I would have likely just paid the money if I had to type out a long, boring appeal letter myself.”
IS AN LLM A GOOD THERAPIST?
Coming back to the question at hand. Opinions on an LLM’s effectiveness as a therapist vary, but by and large experts all seem to agree that it shouldn’t be considered a one-to-one replacement.
This early 2024 study suggested ‘ChatGPT offers an interesting complement to psychotherapy and an easily accessible, good (and currently free) place to go for people with mental-health problems who have not yet sought professional help and have no psychotherapeutic experience.’ But, they caution, it is not a substitute and the solutions offered are hardly comprehensive.
Another study4, focused on the client’s response to advice, complicates the picture. Participants were asked to determine whether a response came from ChatGPT or from a human therapist. Curiously, they were not very good at telling the difference - but this is not the same as whether or not the therapy was effective. It does go some way to explaining why people feel comfortable confiding in the chatbots, though; if it sounds ‘like a human’, then you won’t experience the mental bump that might stop you being honest.
I appreciated this conclusion: AI therapy is like journaling (if your journal was being sold to companies with a profit motive and state actors who love repression). At best, they suggest, it is ‘a structured tool for self-reflection,’ whereas at worst, ‘it gives the illusion of healing, while keeping you stuck in your own head.’
So, as always with this theme, the jury is still out.
What do you all think? Have you used ChatGPT or another LLM (locally hosted or otherwise) for therapy or journaling? Do you think it is something we shouldn’t judge? Or do you think, as an interlocutor of mine recently suggested, ‘we should bring back shame’ in relation to this tool? Would love to hear your thoughts!
OpenAI retains the conversation history of personal accounts, and there is no setting available to alter this data retention policy. Users can manually delete conversations, removing them from their account. However, these conversations are retained on OpenAI servers for up to 30 days to monitor for regulatory violations.
Memory — OpenAI uses memory to store user information for customizing future responses. This creates a permanent record of private information. More here.
https://archive.ph/z4TdH#selection-1139.105-1147.273
The bar is so high though, on the privacy front for me, that I just don’t feel compelled to give too much away if the result is just ‘get a decent itinerary’. Might be just me though.
The actual study is worth checking out, but this conclusion is striking:
In a large sample (N = 830), we showed that a) participants could rarely tell the difference between responses written by ChatGPT and responses written by a therapist, b) the responses written by ChatGPT were generally rated higher in key psychotherapy principles, and c) the language patterns between ChatGPT and therapists were different. Using different measures, we then confirmed that responses written by ChatGPT were rated higher than the therapists’ responses, suggesting these differences may be explained by part-of-speech and response sentiment. This may be an early indication that ChatGPT has the potential to improve psychotherapeutic processes. We anticipate that this work may lead to the development of different methods of testing and creating psychotherapeutic interventions.
Yeah, it is really wild, isn't it? I have found in chronic illness communities that I don't know what to say when people are like "ChatGPT is my friend/therapist/health coach". It gives me a whole ick. 🙃😩
As promised to anyone interested: a small range of free or cheap non-AI resources.
Please note I live with someone who has complex health needs, and these resources have helped me/continue to help me navigate some significant events and feel less isolated.
The Psych Collective
Is based in Wollongong, and has free downloads and cheap courses. Their video on distress helped me understand more clearly the role of the vagus nerve. They offer a free download:
Conquer Anger and Panic
This eBook distils some essentials from our Surviving Distress Course - Master Distress with expert techniques, not just pills. Possibly less helpful if you live in a war zone.
To Write Love on Her Arms, aka TWLOHA, is an amazing organisation in America.
I get weekly blogs written by people with lived experience of suicidal ideation, self harm, binge eating, and a whole lot more. They have a free helpline and a great list of resources across the USA.
TWLOHA also invests directly in mental health treatment through their Treatment & Recovery Scholarship Fund. According to their website, they have invested a total of $4.3 million into inpatient and outpatient treatment that people otherwise could not afford, including 31,891 hours of therapy.
Check them out via Instagram, their website, etc. Great for young people and the LGBTQIA community.
ADDitude: an online resource for those with ADD/ADHD/autism diagnoses. They often have free webinars, which are recorded.
Weekly emails
Headspace: a meditation storytelling app - that will always discount if asked. My partner uses it every night.
Somatic self care: via the Office of Well-Being at Johns Hopkins.
Great short exercises to centre your breath and body, helpful for easing fight-or-flight mode.
The Office of Well-Being offers resources for Johns Hopkins staff, but some are readable by anyone.
Politically Depressed - a podcast and YouTube video blog which is part of "The Fire These Times" collective, an amazing group of people from Lebanon, Syria, Palestine, Sudan, etc.
Their content is vital to help anyone understand the history of those regions, colonialism, etc.
Engaging in community is good therapy.
Ayman has two recent video essays on YouTube: "My Therapist Broke Up with Me" and "Is Psychology Dead?"
The Practice: by Liz Milani, an Australian writer.
Therapy and insight for those reconstructing their faith or leaving church communities (seen through the Christian experience). I read it most days to just slow down.
Free on Instagram and via daily email; the app is $19 a year on Google Play or other stores.
SANE: an Australian organisation.
SANE started as a group of people with various psychiatric diagnoses who felt they often better understood their own conditions. It's grown into a funded organisation with free forums (available globally), free six-session counselling programs with peer workers or "professionals", plus webinars and groups.
Peer mental health workers:
The UK, Canada, and Australia are now training peer workers, i.e. people with lived experience who have undergone training to assist others, and offering affordable support through programs with them. Frankly, it's cheaper to employ a peer worker, but it does mean there is a greater range of accessible programs out there if you look on government websites.
All of these resources offer connection. Reading the comments, what I get is a sense that we have become (or are becoming) too comfortable relying on professional help online since COVID, and ChatGPT therapy may, in my opinion, reduce our willingness to seek out connection.
But if you need help, get it anywhere it helps you; just protect yourself.
Finally: Serving others is therapy.
The best thing I've done since the COVID days is volunteer with a few organisations for a few hours each week.
In helping others I've made broader connections with my new community, and I learn so much by listening.
Thanks Yassmin. 🙏💜