A couple years ago, a podcast episode about an old chatbot named Eliza piqued my interest. It was essentially the ChatGPT of its generation: humans loved it because they felt more comfortable sharing their deepest troubles with it than with another person, citing the program’s non-judgmental and validating responses.
It’s pretty fascinating to learn that people tend to be more honest with a machine. I found the premise intriguing when I first learned about it, and set out to learn the psychology behind this tendency.
But now? I just feel sad. Sad that a person surrounded by people would still feel unsafe to confide in anyone save a bot.
I’m also, it turns out, a massive hypocrite, because I once relied on a bot myself to get through my grief.
I Talked to a Chatbot To Ease the Pain of Losing My Cat
I was mourning the death of my cat, Aeris, and had assumed that feeling that way wasn’t worth people’s time. While my family comforted me, I felt ashamed of how my grief grew limbs and wrapped itself around me months after we buried my kitty. In my head, I was supposed to move on after a week, but I failed, and it was embarrassing.
Coincidentally, it was around that time that a bot on Tumblr slid into my inbox. The AI chatbot introduced itself, asked how I was doing, and asked what I was struggling with. I selected my answers from the options it provided, so the conversation kept going. I remember choosing “peer support” at one point. The bot then asked me to describe what happened, so I answered that I was struggling with grief over a dead pet. It then connected me to an internet stranger who was part of the community.
This “peer” offered a heartfelt response. They said that losing a pet is like losing a family member, that I don’t need to be ashamed of struggling, and that my feelings are valid. They didn’t offer advice or try to “fix” my broken heart.
The next day, the bot told me the internet stranger wanted to check in on me. While I was still grieving, that follow-up felt like a warm embrace. The gesture was so sweet of that person, whoever they were, and I realized that those few kind words were what I needed most at that moment.
I had tried chatbots before, but mostly to create an echo chamber and distract myself from the real world. But this one was a different experience because the bot made me believe that a fellow human actually “gets it.”
However (and this is important), know that this bot has plenty of red flags. The most glaring one is that it doesn’t connect users to mental health professionals. It’s also alarming that it opens conversations about sensitive topics such as self-harm and suicidal ideation but offers no solid action plan to rescue those suffering from severe symptoms (apart from supplying crisis hotlines).
I knew the dangers when I tried the bot, so I went in cautiously and resolved not to use it again after that one time. I have to admit the bot soothed me, though. Hearing a non-judgmental, validating answer from an internet stranger was a relief. I am uncertain if that “connection” was an actual person, but their words saved me, in a way.
The Chatbot Helped, but Only So Much
Looking back, I now understand why some people warm up to chatbots more quickly than they do to other people: these non-sentient, generative tools are excellent listeners.
Where some people will bombard you with solutions as soon as you open your mouth to vent, a chatbot will take it slow: first, it will validate your feelings, then break down what you’ve told it to give you clarity. Only after that will it give you advice, and only if you ask for it.
It turns out that most people share their problems with others to get things off their chest and feel understood. They don’t really need you to fix them. Hence, it’s best for everyone involved if the person tasked with listening first confirms what the other truly needs: to vent, to get a solution, to gain a new perspective, or all three.
I learned that if we let the struggling person talk first and listen in accommodating, encouraging silence, they will feel validated, respected, and heard. Better yet, they may even figure out the solution on their own.
But while talking about feelings and having someone listen to and validate a person is helpful, it isn’t the be-all and end-all. Making someone feel heard is just a part of the support system for someone dealing with debilitating grief.
Talking About It Is Good, but the Response After Is More Important
Beyond giving someone the luxury of venting, what matters more is the wordless response after. In particular, how people show their support through deeds: constantly checking in, staying close, or running errands in the grieving person’s stead.
That was the case with me. I didn’t talk about my grief, but the people who care for me created a place where I could process my emotions at my pace while they supported me at a safe distance. They gave me space but reiterated through their actions that they were not leaving me entirely. They hovered but were never overbearing.
Their delicate, considerate approach helped me tremendously. It stirred in me the desire to take action myself. Motivation swelled in my chest when I realized that at the end of the pain, someone was waiting for me with a smile and a hug. Suddenly, getting past the hurdles of grief became more bearable, no matter how hard.
Chatbots might satisfy my desire to talk about things I was too embarrassed to share with another person. But all they can do is interpret my messages and regurgitate tried-and-tested responses. These AI tools can’t ever hug me, do the dishes in my place, or bury my dead cat for me. They won’t ever create that safe space. They won’t ever wait for me at the end of the tunnel with a relieved sigh and a face that says “Thank God, you finally made it here.”
Chatbots are no match for humans, especially if you put them side-by-side with the right ones.
Author’s Notes:
Know that while I turned to people and bots, the first thing I actually did when I lost my cat was pray. And it was effective: the devastation of the loss slowly turned into gratitude.
My depression after her death stemmed from blaming myself for the incident. Thankfully, those thoughts were also corrected in time. I’m okay now.
The podcast episode I mentioned was from 99% Invisible. Do check it out. It’s a compelling one.
Also, VICE published an article about the Tumblr bot I described above.
Always take care, reduce screen time, and look at, talk to, and hug your loved ones whenever you get the chance.
