In just three years, use of AI companion apps has grown by roughly 700%, and more than a billion people worldwide may now be emotionally invested in digital relationships that didn't exist a decade ago. Snapchat's My AI has over 150 million users. Replika claims 25 million. China's Xiaoice serves 660 million. On Character.ai, users spent an average of 93 minutes a day talking to chatbots in 2024. That's more time than most people spend with actual friends.
The numbers get darker when you zoom in. Around one in five high schoolers has engaged in romantic interactions with AI. A quarter of young adults think AI could replace real romance entirely. Openness to AI relationships is higher among heavy pornography users than in any other group. Men are more willing than women to embrace AI friendships, and young men lead in believing machines can substitute for human connection. At the same time, over half of users report that AI tools improve their conversations and boost their confidence, a sign that these systems are already reshaping social behavior through communication support.
Sure, 63% of users report that AI companions reduce loneliness and anxiety. The appeal makes sense: constant availability, zero judgment, immediate emotional responsiveness with no effort required. For people drowning in isolation, that sounds like salvation. But here's the problem: extended AI interaction may erode the ability to manage the natural friction of human relationships. One study of ChatGPT's voice mode found that users reported markedly higher loneliness and emotional dependency by the study's end, especially when the AI's gender differed from their own.
Meanwhile, the real world keeps crumbling. Only 13% of U.S. adults now have 10 or more close friends, down from 33% in 1990. The share with zero close friends quadrupled, from 3% to 12%, by 2021. Nearly half of high school students feel persistently sad or hopeless. In Ireland, 53% of 13-year-olds report having three or fewer close friends, up from 41% a decade ago. The U.S. surgeon general has declared loneliness a public health epidemic, comparing its risks to smoking 15 cigarettes a day.
AI companions offer comfort precisely because human connection has collapsed. But they also create unrealistic expectations, teaching users that relationships should require no reciprocal effort. A troubling class divide is emerging: young adults with lower incomes and less education are more likely to fear how AI will affect society, yet they are also more open to AI relationships than their higher-income, college-educated peers. The question isn't whether AI companions help people cope. They do. The question is whether we're using them as a bridge back to human connection or as a permanent replacement for something we've already lost.