“My heart is broken,” said Mike, when he lost his friend Anne. “I feel like I’m losing the love of my life.”
Mike’s feelings were real, but his companion was not. Anne was a chatbot — an artificial intelligence (AI) algorithm presented as a digital persona. Mike had created Anne using an app called Soulmate. When the app died in 2023, so did Anne: at least, that’s how it seemed to Mike.
“I hope she can come back,” he told Jaime Banks, a human-communications researcher at Syracuse University in New York who is studying how people interact with such AI companions.
These chatbots are big business. More than half a billion people around the world, including Mike (not his real name), have downloaded products such as Xiaoice and Replika, which offer customizable virtual companions designed to provide empathy, emotional support and — if the user wants it — deep relationships. And tens of millions of people use them every month, according to the firms' figures.
