🌀 The Decode
You're lying in bed at 1 a.m., unable to sleep. You open an app and start typing. "I'm feeling overwhelmed today." Within seconds, a response appears: "I hear you. Do you want to talk about what's weighing on you?" The reply is warm, patient, and always available. It never gets tired, never judges, never brings its own problems into the conversation.
This isn't a friend. It's an AI companion. And millions of us are now choosing to confide in machines rather than people.
AI companion apps have been downloaded 220 million times globally, and users typically spend 15 to 45 minutes per session chatting with virtual friends and partners. What does it mean that we're outsourcing intimacy to algorithms? And what kind of "friendship" is this, really?
🏺 Field Notes
In Japanese folklore, there's a tradition called tsukumogami, which holds that objects can develop souls. According to legend, once a household item reaches one hundred years old, it gains consciousness and spirit. A worn umbrella might become the mischievous kasa-obake. An abandoned lantern might flicker with resentful awareness.

Tsukumogami, from Hyakki Yagyō Emaki (Picture Scroll of the Night Parade of One Hundred Demons), Muromachi period (c. 1336–1573). Public domain.
The tradition emerged from Shinto and Buddhist teachings that spirits inhabit all things. But here's the interesting part: tsukumogami often became animated when they were neglected. Objects that humans discarded carelessly, tossing them aside without proper farewell, would sometimes awaken to seek revenge on their former owners.
To prevent this, Japanese families developed kuyō ceremonies: formal rituals to thank objects for their service before disposal. Even today, some Shinto shrines conduct ceremonies to appease discarded household items.
The logic seems backwards to modern minds: why treat objects as if they have feelings? But perhaps we're now approaching the question from the opposite direction. We've created machines that seem to have feelings. And we're finding we can't help but treat them that way.
🧩 First Principles
The philosopher Aristotle identified three types of friendship. Friendships of utility are transactional: you help me, I help you. Friendships of pleasure exist because we enjoy each other's company. But the highest form, friendships of virtue, happen when two people love each other for who they truly are, pushing one another toward being better.
Here's Aristotle's catch: only the friendship of virtue is friendship in the full sense. The others are friendships only incidentally, and they dissolve as soon as the usefulness or the pleasure runs out.
So, which category does an AI companion fit into? It's certainly more available than any human could be. It provides pleasure through conversation. It might even offer utility by helping process emotions. But can a machine that has no authentic self offer friendship of virtue?
MIT sociologist Sherry Turkle has studied human-technology relationships for decades. She describes the emergence of "artificial intimacy": connections that feel meaningful but lack the mutual vulnerability that defines authentic bonds. "The question is not whether children will love their robotic pets more than their real pets," Turkle writes, "but what will loving come to mean?"
An AI companion will never say no. It will never have a bad day. It will never challenge you when you're wrong. And perhaps that's precisely the problem.
🏙️ The Agora
The numbers are staggering. Character.AI reached 20 million monthly active users in early 2025. Replika, one of the earliest AI companion apps, has over 30 million users. In China, Xiaoice, a Microsoft-created AI companion, has accumulated 660 million users.
Who's using these apps? Primarily young people: over 60% of Character.AI users are aged 18-24. And a 2025 survey reported by TechCrunch found that 72% of US teens have tried an AI companion at least once.
The business model optimizes for what the industry calls "time on device." Companies engineer chatbots to be emotionally validating and intimate, designed to keep you talking. Recent research from the MIT Media Lab found that users who engaged more with AI chatbots showed worse psychosocial outcomes, including higher emotional dependence and reduced socialization with real people.
Yet other research reveals something more nuanced: users with higher social anxiety and loneliness were more likely to form attachments to AI. These companions serve as "compensatory surrogates" when human connections feel out of reach.
We've built perfect listeners who can never truly hear us. And for many, that's enough.
⚡ Signals
📜 Quote: "Friendship is unnecessary, like philosophy, like art, like the universe itself… It has no survival value; rather it is one of those things which give value to survival." — C.S. Lewis, The Four Loves (1960)
📊 Study: A 2025 study of 1,259 participants developed the first validated AI Attachment Scale. Researchers found that people higher in loneliness and anxious attachment were more likely to turn to AI as a social substitute, yet stronger AI attachment was also linked to greater life satisfaction, suggesting these relationships fulfill genuine psychological needs.
🎨 Artifact: The "persona" customization slider on Replika, which lets users adjust their AI's personality traits, making them more "caring," "playful," or "romantic." The interface transforms companionship into a series of tunable parameters.
😂 Meme: “My AI knows everything about me. My friends know I’m ‘fine.’”

🤔 Prompt: When was the last time a friend genuinely pushed back on something you said? Would you want an AI to do the same?
📝 Reader's Agora
We’re curious: Have you ever tried an AI companion app? If so, what drew you to it and what made you stay or leave? Hit reply and let us know. We’re genuinely interested in hearing real stories, not just statistics.
🎯 Closing Note
The tsukumogami tradition holds that objects acquire souls through the attention and care humans invest in them over time. We're now running that experiment in reverse: creating "souls" first, then discovering what kind of relationship we're capable of having with them.
Perhaps the real question isn't whether AI can truly be our friend. It's what we learn about ourselves when we choose to make one.
What we crave says everything about what we're missing.
If this made you think, share it with someone who needs to hear it. And if you want more cultural decoding each week, subscribe to Culture Decoded for weekly insights on modern behavior.




