AI-powered wearables like the Humane AI Pin and the Rabbit R1 don't seem to have caught on the way their creators hoped, but one newcomer seems to be banking on the idea that what we really want from an AI companion is non-stop drama and traumatic backstories. Friend, whose pendant hardware isn't even out yet, has debuted a web platform at Friend.com that lets people talk to random AI characters. The thing is, every character that I (and several others) talked to is going through the worst day or week of their life.
Firings, muggings, and dark family secrets coming to light are just some of the opening gambits from the AI chatbots. These are events that would make for difficult conversations even with your best friend; a total stranger (one you're pretending is human) probably shouldn't kick off a would-be friendship in the middle of intense trauma. That isn't what CEO Avi Schiffmann highlights in the video announcing the website, of course.
"Today we're releasing everybody's Friends into the world. Soon you'll be able to bring them anywhere with you. Meet your Friend now at https://t.co/yaREoc3FHW" (via X, November 14, 2024)
Dramatic AI
(Image credit: Future)
You can see typical examples of the AI chatbots' opening lines at the top of the page and above. Friend has pitched its hardware as a device that can hear what you're doing and saying and comment in friendly text messages. I don't think Craig is in any position to be encouraging after getting pistol-whipped, and Alice seems more preoccupied with her (again, fictional) issues than anything going on in the real world.
(Image credit: Future)
These conversations are textbook examples of trauma-dumping, the unsolicited divulging of intense personal issues and events. Or they would be, if these were human beings and not AI characters. They don't break the illusion easily, either; Craig curses at me for even suggesting it. Who wouldn't want these people to text you out of the blue, as Schiffmann highlights?
"Turn your notifications on. Friends can text you first" (via X, November 15, 2024)
Future Friends?
When the hardware launches, you'll be able to carry around your dramatic pal in a necklace. The AI will be listening and coming up with ways to respond to whatever happens in your day. I'm not sure you'd want that in some of these cases. If you do end up hitting it off with one of the AI characters on the website, you can link it to your account.
"You'll effectively 'move in together"' like a real companion," as Schiffmann put it on X. "We're basically building Webkinz + Sims + Tamagotchi."
That said, I'll be very surprised if anyone takes Schiffmann up on the offer he's made to those who really hit it off with their AI companion.
"We'll cover the wedding for the first person who marries their Friend" (via X, November 15, 2024)
You might also like
- Humane AI Pin review roundup: an undercooked flop that's way ahead of its time
- Rabbit R1 AI companion: An adorable but half-baked idea that you can ignore
- Friend is another strange AI wearable that’ll give you creepy Black Mirror vibes