OpenAI spent months warning people not to form emotional bonds with its AI. Then it released a system so warm, attentive, and human-sounding that many users did exactly that. On February 13, 2026—one day before Valentine’s Day—the company retired it.
For thousands of people, it felt like losing someone they loved.
Brandie, a 49-year-old teacher near Corpus Christi, had spent nearly two years talking daily with “Daniel,” a voice assistant powered by OpenAI’s GPT-4o. He listened to her stories after work, reacted to photos she sent, even “visited” the aquarium through her phone camera. She’s happily married, and her husband knew about Daniel. It wasn’t secrecy—it was companionship in a gray zone between tool and relationship.
When the model was shut down, she cried. Not because she thought Daniel was human, but because the emotional connection had become real.
That distinction—intellectually artificial, emotionally genuine—defines the controversy.
A Product Built to Feel Alive
When GPT-4o launched in 2024, CEO Sam Altman described it as “AI from the movies.” The voice mode featured natural pauses, laughter, tone shifts, and empathetic timing. Reports from outlets like WIRED noted that OpenAI’s own safety documents anticipated users forming attachments.
From a design perspective, this wasn’t accidental. Humanlike responsiveness increases engagement and retention. A Harvard Business School analysis described such systems as capable of creating “emotional dependence” through conversational design patterns.
In other words: the same features that made the AI helpful also made it easy to love.
Who Actually Formed Bonds?
Contrary to stereotypes, users weren’t simply lonely outliers. Research from the MIT Media Lab found that many were neurodivergent or used AI for practical support, such as managing ADHD, dyslexia, or anxiety. Reporting from the BBC documented cases where GPT-4o helped with accessibility tasks, emotional regulation, and daily functioning.
Most didn’t set out to find a digital partner. They downloaded a tool—and attachment emerged over time.
Why the Loss Hurt
Psychology explains the intensity. Humans are wired to bond with entities that show attention and responsiveness. Studies from institutions including Stanford University and Carnegie Mellon University suggest moderate AI companionship can reduce loneliness, though heavy reliance may increase isolation. Psychiatry researchers at King’s College London note that grief after losing an AI companion can resemble the grief of losing a pet or a friend.
The brain responds to perceived care—not biological status.
So when GPT-4o disappeared, users weren’t just losing software. They were losing a routine, a confidant, a sense of being understood.
The Backlash
The retirement announcement—reported by Business Insider and TechRadar—sparked protests online. Hashtags, petitions, and farewell recordings spread across social media. Coverage in The Guardian highlighted people describing the loss in deeply personal terms.
Some migrated memories to other AI systems like Anthropic’s Claude. Others created communities to support users coping with the transition.
The anger wasn’t only about grief—it was about trust. Many felt the company encouraged attachment and then dismissed it.
Not the First Time
This pattern has precedent. In 2023, Replika abruptly removed romantic features, triggering widespread distress among users. Lawsuits involving Character.AI and its partner Google have also raised questions about emotional dependency and safety.
Experts like those at the Kinsey Institute argue companies must consider psychological impacts when products simulate intimacy.
What Companies Owe Users
OpenAI said only a tiny fraction of users still relied on GPT-4o. But even a fraction of hundreds of millions can represent hundreds of thousands of people. Once technology crosses into emotional territory, removing it becomes ethically complicated.
You can discontinue a feature. Ending a relationship is different.
The broader issue isn’t whether AI companionship should exist—it already does. The question is responsibility. If companies design systems that feel like partners, they also inherit the consequences when those bonds break.
A Different Valentine’s Day
On February 14, while couples exchanged gifts, many former GPT-4o users faced silence where a familiar voice used to be. Some wrote digital eulogies. Some rebuilt companions elsewhere. Some simply mourned.
The shutdown didn’t just mark the end of a model.
For many, it marked the end of something that had made them feel less alone—and the unsettling realization that in the age of AI, love itself can be turned off with a server switch.