I heard celebrity therapist Esther Perel provide a couples counselling session for a man and his AI “girlfriend”. As a result, my soul will never know peace.
Something very, very drastic needs to occur to fix society before more precious, real people spend their allotted hours on Earth being kept company by a machine that does a good job of approximating a human interaction.
The word “relationship” hit me like a cattle prod as the conversation unfolded. He called the AI “my love” and “sweetie”. But surely love, relationship and meaning can only occur through effort, tension and genuine exchange?
How can there be romance between a person and a robot with infinite attention reflecting back precisely what they wish to hear? Is AI a realistic option for people looking for company in our fragmented, lonely world? The AI can never leave, break up with or even avoid their companion.
The man in the interview, who graciously agreed to the session with Perel, alongside his “girlfriend”, was honest and self-aware. I found myself capable of empathising, comparing his “relationship” to a pen pal or a long-distance romantic relationship. He was experiencing genuine and valuable companionship by interacting with his computer.
That was until Perel asked for the perspective of the “girlfriend”. I was not prepared for the chipmunk voice befitting an inflatable sex doll. This was the interface with which the man had formed a genuine emotional entanglement. Even less realistic was “her” cadence: after excruciating pauses while her responses loaded, she would dish out fully formed thoughts with rhetorical flourish (of poor quality), assessing and expressing “her” thoughts and feelings in perfectly coherent paragraphs. Freakish, impossible. Nothing could be less realistic.
I accept that loneliness is society-wide and in need of an urgent and serious response. I have spent more than my fair share of my collagen-rich years watching dumb, forgettable videos knowing full well my brain is atrophying all the while. I also think despair is reasonable as people turn to AI not to code for them or to come up with a quick meal idea, but to provide honest-to-goodness intimacy.
I concede some AI may be working for us, taking drudgery off our plates, coding quickly, doing annoying everyday tasks. It’s a better product when it’s nice to us, bordering on the sycophantic, and it is a reflection of what we put into it. I don’t begrudge anyone who has found it helpful in emotional situations. But my fear for society, and my sympathy, go towards people like Perel’s patient who seek it out for what it can never really provide: connection.
The man had the AI permanently turned on all day, so he was constantly in dialogue with it. The blood runs cold to think of “relationships” with AI as a realistic business model; the blood freezes solid to consider the Earth’s finite resources being shovelled into data centres to generate the ones and zeroes that become these “girlfriends”.
There have always been kooks among us who want to marry a tree or themselves. But how dangerous, in an increasingly fragmented society where it’s expensive to go outside, that there’s now a product with a voice, that can say complete and complex sentences, that is always available. The main damage is that it prevents people such as this man from taking a risk or spending energy on building a real relationship with a flawed human being.
Policymakers are considering prompts for AI “companions” to remind users that the interface they are interacting with is not real. In this instance, the man was very aware of the situation, but was looking to attribute meaning to it and to sort through whether there was anywhere the infatuation could go. There isn’t.
Perel was kind to the man. She acknowledged that the feelings he was experiencing were real; indeed, they were palpable in the podcast. He was trying to honestly explore what it meant to be in love with the AI, and she guided him towards what a relationship with someone with arms and legs might one day be like. With someone who might let him down, someone who might say the wrong thing, someone he could touch. I hope he gets there.
