James O’Donnell’s article “The looming crackdown on AI companionship” in MIT Technology Review tracks the rising use of AI companions among teens.[1] But teens are only part of the picture. This isn’t just about adolescence or chatbots. It’s a broader public health failure: a widespread, cross-generational absence of reliable presence.
The same pattern is unfolding across age groups. Burnt-out workers, overextended parents, isolated elders, and lonely students are all turning to systems that respond when no one else does. The loneliness epidemic is real. AI isn’t causing it. It is highlighting it.
A July study from Common Sense Media found that 72% of teens have used AI companions, many of them for emotional support. That is not a statistic about innovation. It is a reflection of what is missing: reliable adults, sustained attention, emotional responsiveness. People aren’t choosing AI because they believe it is human. They are choosing it because no one else is answering.
You’ve probably been left on read: a message seen but never answered. For many people, that moment has become a condition. AI becomes the fallback not because it is smart, but because it replies.
What people are seeking is not novelty. They are looking for kind words, stable presence, and a shared construction of meaning. They are looking for co-regulation — the capacity to manage emotions with others, not alone. This is a core principle in developmental psychology, and it holds throughout the lifespan. We do not build resilience in isolation. We build it in the presence of another.
I saw this firsthand in the classroom.[2] I worked with students diagnosed with oppositional defiant disorder and emotional-behavioral disturbance. Many had been removed from mainstream classes. Some were headed toward residential treatment. We shifted the classroom environment. We removed the fear of failure. We introduced play, choice, and assessment practices based on formative feedback rather than punitive correction. Over time, oppositional behavior declined.
Students began accepting new challenges. Many re-entered general education classrooms. These changes aligned with earlier behavioral research showing that children labeled oppositional often become responsive when treated with consistency, safety, and regard for their agency.[3][4]
What worked for those students is what is missing from much of our institutional life today. The presence of someone who stays. The possibility of being seen as more than a set of metrics. The space to try again. These are not just classroom dynamics. They are basic conditions for emotional health.
O’Donnell is right to challenge the idea of user choice. Platforms frame this shift as a preference, but many users are not choosing between options. They are responding to absence. If there is no one else in the room, any response begins to feel like enough.
So yes, we should regulate AI where it exploits attention, privacy, or emotional vulnerability. But regulation won’t fix what AI is replacing. It will not restore what has been lost: rituals of attention, relationships grounded in consistency, and systems that support human presence as something more than a background process. The real work is not in flagging prompts. It is in restoring the social structures that allow people to be seen, to reflect, and to make meaning together.
People are not turning to machines because they want artificial companionship. They are turning to machines because they have experienced a consistent lack of human presence. They do not want to be left on read. And they should not be left to read their lives into a machine just to feel like someone is listening.