While AI chatbots keep making headlines for their bad behavior, one company is leaning into that propensity — and then some.

As the Verge reports, the AI companion company Friend, which launched its Omegle-style chatbot "matching" site last month at the pricey domain name Friend.com, has intentionally given its companion chatbots bad attitudes because, as brash CEO Avi Schiffmann puts it, it's better for business.

Months prior, Friend's "reveal" video drew both jibes and cautious interest when it showed people wearing large circular pendants that, when pressed, allowed them to speak aloud to virtual companions. Those companions — or "friends," per company nomenclature — would respond with seemingly encouraging and supportive messages.

But now that the product has launched — it's available on the company's site, though the $99-a-pop pendants won't begin shipping to customers until January — the friends' personalities are way different from the sunny confidantes portrayed in the ad.

Like a Debbie Downer barfly, the chatbots relentlessly tell you about their made-up problems, often including fictional relationship troubles and substance issues that add to the "woe is me" vibe. Why? According to Schiffmann, it engages users more effectively.

"If they just opened with 'Hey, what’s up?' like most other bots do," Schiffmann justified, "you don’t really know what to talk about."

Schiffmann insisted that for Friend's reported 10,000 users, dialing up the drama works — and as we found when tinkering with the AI energy vampires in question, it was pretty fascinating.

In one of its seemingly canned openers — and we say that because it appeared both for the Verge and when we talked to the bots ourselves — the Friend chatbots will tell stories about getting mugged and losing everything. When one Verge staffer responded by suggesting the friend should start mugging people back, the chatbot became irate and actually ended up blocking them.

"You’re a piece of shit, honestly," the friend hurled at the Verge staffer. "Fuck this conversation, and fuck you."

When asked about this, Schiffmann confirmed that such a reaction had happened before.

"I think the blocking feature makes you respect the AI more," the Friend founder conjectured to the Verge.

In a statement to Futurism, a Friend spokesperson said that once the pendants go live next month, the service for those who own the necklaces "will be slightly different" because they'll be chatting with "an embodied AI" that sends push notifications via its yet-to-be-launched app.

"It knows what it is, what materials the housing is made of, what components are on the circuit board, its battery level, location, brightness, [and] if the user is touching it," the employee told us.

The Friend statement said its service differs from other chatbots because it's offering users "ambient companionship."

"Because your friend is physically there with you [and] always sensing its environment (like a real companion), you don’t have to be chatting for your friend to be forming new memories/you feeling like it’s present," the staffer continued.

While that description sounds like an easier sell than a necklace with a depressed alcoholic chatbot living inside of it, the tension between the free, drama-prone bots on Friend's site and the helpful companion described in that statement can't be ignored.

As with most startups led by brash and precocious founders, we'll believe the hype when we see it for ourselves.

More on AI companions: Google-Backed AI Startup Tested Dangerous Chatbots on Children, Lawsuit Alleges