
AI might not have taken your job yet — but it's already writing your breakup text.
What began as a productivity tool has quietly become a social one, and people increasingly consult it for their most personal moments: drafting apologies, translating passive-aggressive texts, and, yes, deciding how to end relationships.
"I wholeheartedly believe that AI is shifting the relational bedrock of society," says Rachel Wood, a cyberpsychology expert and founder of the AI Mental Health Collective. "People really are using it to run their social life: Instead of the conversations we used to have — with neighbors or at clubs or in our hobbies — those conversations are being rerouted into chatbots."
What's at stake
Outsourcing social tasks to AI is "deeply understandable," says Dr. Nina Vasan, a clinical assistant professor of psychiatry at Stanford University, "and deeply consequential." It can support healthier communication, but it can also short-circuit emotional growth. On the helpful side, she's seen people with social anxiety finally ask someone on a date because Gemini helped them draft the message. Other times, people use it in the middle of an argument — not to prove they're right, but to consider how the other person might be feeling, and to figure out how to say something in a way that will actually land.
"Instead of escalating into a fight or shutting down entirely, they're using AI to step back and ask: 'What's really going on here? What does my partner need to hear? How can I express this without being hurtful?'" she says. In those cases, "It's helping people break out of destructive communication patterns and build healthier dynamics with the people they love most."
Yet that doesn't account for the many potentially harmful ways people are using LLMs. "I see people who've become so dependent on AI-generated responses that they describe feeling like strangers in their own relationships," Vasan says. "AI in our social lives is an amplifier: It can deepen connection, or it can hollow it out." The same tool that helps someone communicate more thoughtfully, she says, can also help them avoid being emotionally present.
There's also the risk that regularly relying on a chatbot as an arbiter or conversational crutch will erode important skills like patience, listening, and compromise. Heavy or prolonged use can skew social expectations, too: users come to expect immediate replies and 24/7 availability. "You have something that's always going to answer you," Wood says. "The chatbot is never going to cancel dinner plans on you. It's never going to really push back on you, so that friction is gone." Of course, friction is inevitable in even the healthiest relationships, and people who grow used to its absence can lose patience over the slightest inconvenience.
Then there's the back-and-forth engagement that makes relationships work. If you grab lunch with a friend, you'll probably take turns sharing stories and talking about your own lives. "However, the chatbot is never going to be, like, 'Hey, hang on, Rachel, can I talk about me for a while?'" Wood says. "You don't have to practice listening skills — that reciprocity is missing." That imbalance can subtly recalibrate what people expect from real conversations.
Plus, every relationship requires compromise. When you spend too much time with a bot, that skill begins to atrophy, Wood says, because the interaction is entirely on the user's terms. "The chatbot is never going to ask you to compromise, because it's never going to say no to you," she adds. "And life is full of no's."
The illusion of a second opinion
Greater AI literacy is essential, too, says Dr. Karthik V. Sarma, a health AI scientist at UCSF. Many people use LLMs without understanding exactly how and why they respond the way they do. Say, for example, you're planning to propose to your partner, but you want to check in with people close to you first to confirm it's the right move. Your best friend's opinion will be valuable, Sarma says. But if you ask the bot? Don't put too much weight on its words. "The chatbot doesn't have its own positionality at all," Sarma says. "Because of the way the technology works, it's actually much more likely to become a reflection of your own positionality. Once you've molded it enough, of course it's going to agree with you, because it's kind of like another version of you. It's more of a mirror."
"We shouldn't design AI to perform relationships for us — we should design it to strengthen our ability to have them," Vasan says. "The key question isn't whether AI is involved. It's whether it's helping you show up more human or letting you hide. We're running a massive uncontrolled experiment on human intimacy, and my concern isn't that AI will make our messages better. It's that we'll forget what our own voice sounds like."