December 23, 2024

In Defense of Using ChatGPT to Text a Friend

[Illustration: a digital bouquet of flowers with a card that says “sorry,” on a yellow background]

A friend recently sent me a link to a popular social-media post, a screenshot of a text message. The caption made clear that the text was the response someone had received from a friend after asking for support during a divorce. “I’m so sorry to hear you’re going through such a tough time,” the text read. “It’s very normal to feel what you’re feeling for a while. Love is a hard come down.” The recipient was livid. The message was “weird.” Canned. Out of character. “Told another friend about it,” the tweet said, and “she said you can tell if an iMessage is ChatGPT if the apostrophes are straight, not curved. Well guess what.” No need to guess; the apostrophes were straight as arrows.

Since ChatGPT launched in 2022, it’s been a ghostwriter for all sorts of hard things—breakup texts, obituaries, wedding vows. The public reaction is usually negative: The machines are taking over, and they’re not particularly eloquent, and this is absolutely a sign of the downfall of society. My friend sent me the (since-deleted) divorce post with a bit of eye-rolling disdain. That made sense: “Love is a hard come down.” Strange condolences, live from the matrix. There was a certain absurdity to the whole situation: one person yearning for human comfort after a severing of human connection, the other (allegedly) providing it via a digital ghostwriter. Was saying something from the heart so hard?

Apparently, yes. And surmounting that difficulty is too much for many people; it’s why ghosting exists. So if someone is struggling to find the right words, or any words at all, perhaps turning to AI for help isn’t so bad. Reading that post, I thought: At least the person texted back.

Not knowing what to say to someone in need, someone in pain, is an eternal problem. Many of our rituals around death, for example, are ancient, and have adapted awkwardly to the modern era. Sitting shiva is not something you can really do in the metaverse. Are you supposed to mute yourself during a Zoom funeral? But the most basic, fundamental rule of support for those going through a hard time remains: Show up. Bring some bagels, buy someone a drink, log on. And if you’re so stumped by the need for human language in the face of sorrow that you use ChatGPT to formulate a thought for you, so be it.

After a particularly devastating death in my own life, I found myself in need of a lot of comfort. In the immediate aftermath, I received it in abundance. But as time inevitably marched along and I was left alone, I often texted my friends. I wasn’t looking for a specific solution, just some company, a confirmation of my existence. Some sent warm platitudes; others recommended movies or sent me funny TikToks. I don’t really remember the specific conversations or who said exactly what, but I do remember who was present. Even more, though, I remember who was not. I remember who didn’t respond at all.

Grief is a lonely and totalizing emotion. It can feel like you’re slipping into a black hole. So when you muster the wherewithal to reach out to someone, you’re essentially asking them to pull you out of that hole. They’re on the other side; they have the power. Can they grab your hand? When they don’t respond, it can feel as if you’re sinking away while they look on.

In his book about the death of his two children, Finding the Words, Colin Campbell laments that so many friends, when trying to articulate their sympathy, landed on “There are no words.” He was hoping for more, arguing that the approach treated grieving “as a taboo subject that is too sensitive to discuss openly.” What I think this overlooks, though, is that “There are no words” is itself, in fact, words.

I’ve thought a lot about the people who didn’t respond to my messages, tried to puzzle out their motives (or lack thereof). What would make someone ghost a friend in pain? The prospect of committing to a sentiment, of being called on to respond at all, can be overwhelming. Some may fear saying something that could inadvertently make the pain worse. For those scrambling to articulate the massive blob of feeling swelling within their mind, it may seem easier to simply not respond.

I considered this as I imagined the person on the other end of the viral tweet. Someone wanting to be helpful, available, consoling. Someone totally overwhelmed by the very alive and immediate need of their friend. I imagined them turning to ChatGPT, typing Help me write a note to my friend who is hurting from a recent divorce, and feeling that what they got back did resemble what they felt. I understood them sending it.

I imagined other scenarios that felt more plausible: someone asking their spouse, their parents, their therapist, their neighbor, their priest, their rabbi, anyone they trust, for guidance. I wondered if maybe asking ChatGPT for help wasn’t so different.

People are flummoxed by other people’s pain. Type What to say to someone into Google, and it turns up an endless scroll of scenarios: “with cancer,” “who is dying,” “who had a miscarriage.” For many tragic life milestones, we are without a handbook. Pain shape-shifts. What is comforting to one person may be upsetting to another. So I imagined a friend wanting to be there for a friend and turning to the tools they had at hand.

To be clear, I am not a ChatGPT enthusiast. The technology poses a real threat to the value of both writers and the written word, and using AI to help a friend is definitely a “break glass in case of emergency” situation. But emergencies happen, and I won’t pretend that AI’s not here, just as I won’t pretend that text messaging is not, for many people, a dominant form of communication. Though some psychologists have argued that a conversation about deep matters shouldn’t be happening over text in the first place, the reality is that, in modern life, our friends live in our phones. Objecting to that is like arguing that someone who drives 10 miles to work is missing out on the benefits of walking there. Sure, but that’s not really the point. Phones are just a part of life now. It’s as natural to ask a friend for support via text as it is to use a phone to read a recipe or catch up on the news. And it’s natural to want help composing a meaningful response if you know that a friend could refer to it on that same phone, days later.

Distaste for the use of ChatGPT for texts probably isn’t about the technology anyway. A 2023 study found that people reacted negatively when they learned that a friend had used AI to write supportive texts, thinking that the friend had “expended less effort”—but the study’s participants felt similarly when they discovered that the friend had received writing assistance from another human, a practice that’s been common for years.

Turns out, people would prefer to receive an authentic human response, yet many feel nearly helpless to offer one themselves. It’s an understandable double standard, but it’s a double standard nonetheless. Even if the words of support are less than perfect, should we not try to extend the same generosity that we would hope to receive? Learning empathy is a long project that takes trial, error, and maturity—and people may not feel properly equipped when a friend in need asks for help. But they can try to break through that brick wall with whatever pickax they’ve got. Whether the words come from the heart or they come from ChatGPT, at least they’re coming.

