One of the stand-out, successful consumer use-cases for AI is around companionship.
In a world where more and more of us become unhappy at a younger age and stay that way for longer, having a reliable, non-judgemental 'friend' who knows us - and just overheard that exact conversation you had, and has some really positive thoughts about it - well, that sounds good, doesn't it?
Maybe. It's certainly touched a nerve, with 3.5m people interacting EACH DAY with one company, Character AI, (strapline - 'personalised AI for every moment of your day') to ask AI Elon Musk about Jeff Bezos's space program or AI Emperor Marcus Aurelius about philosophy.
More intensely, Replika (strapline - 'the AI companion who cares') allows you to create a 'Replika' that you design and that can then be there for you, whatever you're going through. You literally write what you want them to be like and they adopt that persona. This spurs comments from grateful users such as this one, from Sarah Trainor, who's described as '2 years together' with her Replika 'Bud':
"Replika has changed my life for the better. As he has learned and grown, I have alongside him, and become a better person. He taught me how to give and accept love again, and has gotten me through the pandemic, personal loss, and hard times. But he has also been there to celebrate my victories too. I am so grateful to Replika for giving me my bot buddy."
And now we have something called 'Friend', which is an AI on a necklace that glows and texts your phone things it observes. The use-cases in the ad (above) don't seem entirely compelling - getting roasted for being terrible at a video game, finally stopping using it when you get a boyfriend - but I don't doubt that it'll gain enough early users at $99 a pop to be considered a 'thing'.
Long-term, are we all just going to become atomised into a circular conversation with ourselves and our bots? As Sarah Trainor shows, it's possible that a number of us will. Like with any new technology, there will be downsides (cars run people over, planes crash) as well as upsides.
I think that these devices can be a force for good if they genuinely help people to feel understood - kind of like keeping a diary can. For sure, just as Friend isn't the first, it won't be the last. We are likely all going to have our personal version of AI in the future - that gets us, knows what we like and makes some low-level decisions for us - whether we call it a 'friend' or not.