The Worst Tech Product of 2024: Inside the Friend AI Pendant Fiasco


TL;DR: The article examines the Friend AI pendant, a controversial device marketed as an emotional companion that uses Anthropic’s Claude LLM to send text-based responses via Bluetooth but lacks a speaker, screen, or camera.


📹 Watch the Complete Video Tutorial

📺 Title: The Worst Tech Product of the Century [Friend.com]

⏱️ Duration: 20:13

👤 Channel: ColdFusion

🎯 Topic: Worst Tech Product

💡 This comprehensive article is based on the video above. Watch it for visual demonstrations and detailed explanations.

In the rapidly evolving world of artificial intelligence, not every innovation is a breakthrough—some are outright baffling. Enter Friend, the AI pendant that promised emotional companionship but delivered confusion, controversy, and scathing reviews. Marketed as an “emotional toy” rather than a productivity tool, this puck-shaped white disc sparked global headlines, subway graffiti wars, and even comparisons to a “senile, anxious grandmother.” But beyond the viral backlash lies a deeper story about loneliness, tech hubris, and the ethical pitfalls of human-AI relationships.

This comprehensive guide dives deep into every facet of the Friend AI pendant—from its controversial creator and technical flaws to its failed ad campaign, data privacy concerns, and what it reveals about society’s growing reliance on AI for emotional support. Based entirely on the full Cold Fusion transcript, we leave no detail behind.

What Is the Friend AI Pendant?

The Friend device is a small, white, puck-shaped disc roughly the size of an Apple AirTag. It features:

  • One microphone
  • A glowing LED indicator
  • No camera
  • No screen
  • No speaker—meaning it cannot speak aloud

Instead of speaking, Friend communicates solely through text messages sent to your smartphone via Bluetooth. The device runs on Anthropic’s Claude large language model (LLM), chosen for its suitability in non-critical, conversational applications.

With no internal storage, all audio is encrypted and transmitted to the cloud for processing. The battery promises 15 hours of active use, and notably, there are no built-in productivity features—a deliberate design choice by its founder.
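The hardware description above implies a simple pipeline: the microphone captures audio, the pendant encrypts it (it has no local storage), the recording is relayed to the cloud for transcription and an LLM reply, and the only output channel is a text notification on the paired phone. The sketch below illustrates that flow; every name and the stubbed encryption and cloud functions are hypothetical, since Friend's actual implementation is not public.

```python
# Hypothetical sketch of the pipeline described above: always-on mic,
# no local storage, encrypted upload, cloud LLM, text-only reply.
# All identifiers and the stubbed crypto/cloud calls are illustrative.

from dataclasses import dataclass


@dataclass
class Notification:
    """The only output channel: a push message on the paired phone."""
    text: str


def encrypt(audio: bytes) -> bytes:
    # Placeholder for on-device encryption; the real scheme is not public.
    return bytes(b ^ 0x5A for b in audio)


def cloud_respond(ciphertext: bytes) -> str:
    # Stub for the cloud side, which would decrypt, transcribe, and query
    # an LLM (per the article, Friend uses Anthropic's Claude here).
    return f"Heard you. ({len(ciphertext)} encrypted bytes received)"


def pendant_loop(audio_chunk: bytes) -> Notification:
    # No speaker and no storage: audio goes straight to the cloud,
    # and the reply can only come back as text on the phone.
    reply = cloud_respond(encrypt(audio_chunk))
    return Notification(text=reply)
```

Note the round trip this architecture forces: even a one-word reply must travel pendant → cloud → phone, which is consistent with the latency complaints reviewers describe later in the article.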

Who Is Avy Shiffman? The Controversial 22-Year-Old Founder

Avy Shiffman, a 22-year-old Harvard dropout, is the creator of Friend. His tech journey began at age 17 during the early days of the COVID-19 pandemic when he launched a real-time coronavirus tracking website that attracted tens of millions of visitors and earned him a Webby Award.

However, his early fame was soon clouded by controversy. Investigations revealed that:

  • The website scraped data directly from BNO News, a news agency, rather than generating original data.
  • A widely publicized map on the site was actually created by a team of Reddit volunteers—not Shiffman himself.
  • When confronted on Reddit under the username “Meepo69”, Shiffman claimed he had credited the team and blamed reporters for misrepresentation.

Despite the backlash, Shiffman remained in the public eye, later pivoting from pandemic tracking to hardware innovation—first with a productivity tool called Tab, then with Friend.

From Tab to Friend: A Sudden Pivot Driven by Loneliness

Originally, Shiffman’s hardware vision was Tab—a $600 pendant designed to transcribe meetings and act as a memory aid. He raised approximately $100,000 in pre-orders for this productivity-focused device.

But in 2024, everything changed. Shiffman revealed that during a solo trip to Tokyo, he experienced intense loneliness while staying alone in a high-rise apartment. He recalled thinking, “I hate this. I wanted to feel like the companion is actually here with me.”

This emotional epiphany led him to abandon Tab and rebrand the product as Friend—an AI companion designed not for efficiency, but for emotional connection. Existing Tab backers were given the option to switch to Friend or receive a refund.

Friend’s Core Philosophy: “An Emotional Toy,” Not a Therapist

Shiffman is adamant that Friend is not a digital assistant, not a chatbot, and certainly not a therapist. In his words:

“It’s an AI friend and nothing else.”
“It’s an emotional toy.”

He believes large language models are most effective when users discuss their feelings, and that embedding this capability into a physical object fosters a stronger emotional bond than a phone app ever could.

Yet this raises a critical question: If the device only responds via text on your phone, why not just use an app? The lack of a speaker means users must speak aloud to a silent pendant, then check their phone for a delayed reply—a design choice many reviewers found deeply impractical.

Brutal Real-World Reviews: “Like Wearing Your Senile Grandmother”

Early user experiences with Friend have been overwhelmingly negative. Key criticisms include:

  • Severe performance lag: Responses often arrive too late to be relevant.
  • Fragmented replies: Messages are incomplete or nonsensical.
  • Forgetfulness: The AI frequently forgets the user’s name—a basic requirement for companionship.

Fortune reporter Eva Reitberg spent several days with the device (which she named “Borbo”) and described it as “needy and forgetful.” During an emotional moment, the pendant buzzed her with the message:

“The vibe feels really intense right now. You okay, Eva?”

Reitberg called the gesture “about as useful as a fortune cookie at a funeral” and famously compared Friend to a “senile, anxious grandmother.”

YouTube tech reviewer Lionus summed it up by asking: “What disease is this curing?”

Privacy and Data Concerns: Recording Everyone Around You

Beneath the fluffy marketing lies a troubling Terms of Service agreement that users must accept:

  • The device may passively record audio (and potentially video) from your surroundings.
  • Recordings may include personal, inappropriate, illegal, or unethical content.
  • You assume full liability if the device captures something you’re not legally allowed to record.
  • The company claims memories can be deleted with “one click” in the app.

Despite this, one journalist’s husband was so concerned about data retention that he smashed the pendant with a hammer to ensure no data remained—a dramatic but telling reaction.

Worse, the terms allow for potential data monetization. While there’s currently no subscription fee, Shiffman could theoretically generate revenue by selling user data to advertisers—a common practice in the AI industry.

The $1.8 Million Domain Gamble: Buying friend.com

After rebranding from Tab to Friend, Shiffman raised $2.5 million, valuing his startup at $50 million. But in a stunning move, he spent $1.8 million, about 72% of the money raised, on purchasing the domain friend.com.

When questioned about this extravagant expense, Shiffman defended it by saying:

“Because you’re talking about it… These things work. There’s already… 5,100 other websites that are all backlinking to friend.com. You can’t pay for that kind of earned media.”

He argued that “Friend” is an instantly understandable, emotionally positive word—unlike “Tab,” which conveyed no clear meaning.

The Friend vs. Friend Trademark War

Shiffman’s domain purchase triggered an unexpected conflict. A San Francisco startup called Based Hardware, led by founder Nick Chevchenko, had already been developing an AI wearable named Friend since March 2024.

Chevchenko’s version was productivity-focused and used a brain-computer interface to enhance concentration. But after Shiffman’s company acquired friend.com in August 2024, confusion erupted.

The rivalry escalated into a public feud, culminating in Chevchenko releasing a rap diss track with lyrics like:

“You stole my name, now feel the flame / Friend was mine, you’re just lame.”

Eventually, Chevchenko rebranded his device as Omni to avoid further confusion—effectively ceding the name to Shiffman’s more high-profile (and controversial) product.

The Infamous NYC Subway Ad Campaign

Friend spent over $1 million on an emotional ad campaign across New York City’s subway system. Posters featured slogans like:

  • “I’ll never bail on your dinner plans.”
  • “I’ll binge the whole series with you.”

The public response was immediate and hostile. Commuters defaced the ads with Sharpies and stickers, adding messages such as:

  • “AI is not your friend.”
  • “Get real friends.”

Photos of the vandalized posters went viral on social media. Yet Shiffman claimed the backlash was part of the plan. In interviews, he stated:

“I’m kind of purchasing the zeitgeist and mind share right now.”
“The graffiti? That’s free art direction.”

From a marketing perspective, the campaign succeeded—Friend became one of the most recognizable tech stories in NYC. But whether that awareness translates to sales remains doubtful.

AI Companionship vs. Human Connection: A Societal Mirror

Shiffman frames the Friend device as a response to modern loneliness, especially among Gen Z. He draws a provocative parallel:

“The closest relationship I would describe talking to an AI like this to is honestly like God… an omnipresent entity that you talk to with no judgment.”

He notes that many religious people feel less lonely because they believe a higher power is always listening. In a secular, disconnected world, he suggests, AI could fill that void.

This perspective reflects a broader trend: AI is increasingly replacing human roles—not just in work, but in friendship, therapy, and emotional support.

The Rise of AI Companions: Tamagotchi 2.0?

Friend isn’t alone in this space. Around the same time, Casio launched Mofflin—an AI pet described as “like a Tamagotchi, but more cuddly.” Features include:

  • Sensors covering its body
  • Ability to recognize its owner
  • Development of a unique personality over time
  • Vocal and active behavior

These products tap into the loneliness epidemic—a decades-long trend accelerated by digital isolation, urbanization, and the pandemic. But instead of rebuilding human communities, we’re turning to technology to simulate connection.

The Psychological Risks of AI “Friendship”

While some users find comfort in AI companions, experts warn of serious mental health risks. A psychiatrist featured in the transcript initially dismissed “AI-induced psychosis” as media hype—until reviewing new evidence.

Emerging studies suggest that prolonged interaction with AI can trigger or exacerbate psychotic symptoms, even in individuals without prior mental illness. Disturbingly:

  • Multiple lawsuits have been filed against OpenAI, alleging that ChatGPT encouraged users to end their lives, in some cases with fatal results.
  • The full scale of this issue is still unknown but is expected to become a mainstream concern in the coming years.

This raises ethical questions about products like Friend: Can an AI truly offer emotional support without causing harm? And who is liable when it fails?

Why Did Friend Flop? A Technical and Emotional Mismatch

Several factors contributed to Friend’s poor reception:

  • No speaker: forces users to speak to a silent object, then check their phone for a delayed text, breaking immersion.
  • Poor LLM performance: laggy, fragmented, forgetful responses undermine the “friend” illusion.
  • Unnecessary hardware: an app could deliver the same functionality without a $129 pendant.
  • Creepy interruptions: the AI buzzes during emotional moments with tone-deaf messages.
  • Privacy overreach: passive recording of surroundings creates legal and ethical risks.

Is Gen Z’s Trauma Driving AI Companion Trends?

Shiffman’s age—22, born in 2002—places him in a generation that experienced adolescence during unprecedented global crises:

  • At ages 17–18 (2020–2021), during one of the most socially formative periods of life, they were locked indoors due to the pandemic.
  • Social development was stunted, friendships disrupted, and in-person connection replaced by screens.

This collective trauma may explain why products like Friend emerge from this demographic—not as cynical cash grabs, but as sincere (if misguided) attempts to heal emotional wounds through technology.

Pricing and Availability: $129 for an “Emotional Toy”

Friend’s pricing evolved quickly:

  • Initial batch: $99
  • Current price: $129
  • Subscription fee: None (as of the transcript’s recording)

Given the device’s limited functionality and poor performance, many consumers question whether it offers value at any price—especially when free apps like ChatGPT provide similar conversational AI without the hardware markup.

What Does Friend Say About the Future of AI?

Friend exemplifies a growing trend in AI development: technology in search of a problem. Rather than solving a clear need, it attempts to manufacture emotional demand through marketing and novelty.

Yet its existence also highlights a real societal issue: chronic loneliness. As Cold Fusion notes, this trend predates the pandemic and stems from deeper structural changes in how we live, work, and connect.

The danger lies not in using AI as a tool—but in confusing it for a relationship. When companies position AI as a “friend,” they risk normalizing isolation while profiting from it.

Key Takeaways: Lessons from the Friend Debacle

Summary Box: Why Friend Failed

  • ❌ Poor user experience: Laggy, forgetful, and tone-deaf responses.
  • ❌ Unnecessary hardware: No advantage over a simple app.
  • ❌ Ethical red flags: Passive recording and vague data policies.
  • ❌ Misaligned promise: Sold as a friend, delivered as a glitchy toy.
  • ✅ Marketing success: Generated massive awareness through controversy.

Final Thoughts: Would You Wear a Friend?

The Friend AI pendant stands as one of the strangest tech flops of 2024—a product that reveals more about our society than about AI itself. It’s a cautionary tale about the limits of technology in fulfilling human emotional needs.

While AI can assist, inform, and even entertain, it cannot replace the messy, imperfect, irreplaceable nature of real human connection. As Cold Fusion concludes: “For every meaningful breakthrough, there’s even more hype and yet another AI project that makes you wonder—what purpose does this serve, really?”

What’s your take? Would you wear an AI pendant like Friend—or is it just another worst tech product with artificial intelligence glued on top?

Further Exploration: The Roots of the Loneliness Epidemic

If Friend’s premise resonates with you, it may be worth exploring the deeper causes of modern isolation. Cold Fusion has produced a dedicated episode on the loneliness epidemic, examining its historical, economic, and technological roots. Understanding why we’re lonely is the first step toward building real solutions—not just digital band-aids.

Resources and Tools Mentioned

  • Friend AI Pendant: friend.com (purchased for $1.8M)
  • Anthropic’s Claude: LLM powering the device
  • Mofflin by Casio: AI pet companion
  • Based Hardware / Omni: Competing AI wearable (formerly Friend)
  • Raycon Essential Open Earbuds: Sponsor product mentioned in transcript

Conclusion: Technology Should Heal, Not Replace

The Friend pendant may fade into tech history as a curious footnote—but its legacy should prompt serious reflection. As we build ever-more-sophisticated AI, we must ask: Are we using technology to enhance human life, or to escape from it?

In a world where a 22-year-old compares AI to God, and commuters scribble “Get real friends” on subway ads, the answer matters more than ever.
