Could we fall in love with robots?
Signs are that we are capable of forming intimate relationships with many things, particularly those programmed especially for us.
Even for a science-fictional character, Theodore Twombly is having a bad day. At about the midpoint in Spike Jonze’s 2013 Oscar-winning movie ‘Her’, Twombly is – quietly, agonisingly – drawing out the decidedly non-utopian experience of putting the final signatures on his divorce papers. Joaquin Phoenix’s Twombly sits across from Rooney Mara’s Catherine in an open-air cafe and makes a half-joke about being “a really slow signer” and how it had taken him “three months just to write the letter ‘T’”. Catherine swallows a laugh. As she takes the papers from him, he pleads, defeated, that she doesn’t “have to do it right now”, but Catherine gently insists. Whatever has happened to the pair, there is now a gulf between them: a void into which one or both of them might fall reaching for the other, unless she remains resolute. She scratches out their future with a ballpoint.
Twombly isn’t the strong, romantic male archetype (“Everything makes you cry,” she says half-jokingly, as they awkwardly pick at the lunch for which neither has the stomach). He is pastel-painted and sensitive, and makes his living writing heartfelt letters and anniversary cards for clients of some near-future company to which affection can be outsourced. But neither is Catherine the ‘Her’ in the film’s title. And for all that she has prepared for this meeting, it is she, not he, who ends up in emotional meltdown. Yes, he answers her: he is seeing somebody new. Her name is Samantha (voiced by Scarlett Johansson), and she’s an operating system.
Samantha is a bunch of other things, too, but Twombly and Catherine have hit an impasse. While Samantha is indeed (at least, as the film portrays her) ‘complex’ and ‘interesting’ and ‘her own person’, Catherine – speaking for the audience – just can’t get past the carbon-silicon divide. “You’re dating your computer?” she asks, incredulous, before pain and outrage send the conversation into a vengeful tailspin.
“I’m glad that you found someone,” she says, twisting her hurt around to strike at him. “It’s perfect.”
But Twombly really does love Samantha. And while the audience may have been more sympathetic to Catherine’s point of view in 2013, the barrier between human and machine is a problem that today’s tech giants are racing to solve.
If you use social media, you will doubtless have seen some adorable video of a toddler attempting some confused conversation with Siri or wishing its parents’ Alexa good night. If you are more plugged into the progress these companies are making, you may have seen the video from 2018 of Google’s Assistant booking a hairdressing appointment without announcing itself as an AI. The idea of a robot passing itself off as a human is millennia older than Turing, but only now are we starting to see the real, commercial possibilities – possibilities that are being tweezed from all angles by very clever people with very large amounts of cash.
Then for much of 2020, citizens of the developed world were forced into their homes for extended periods of government-sanctioned isolation. If you were isolating alone, you may have started to receive some peculiar recommendations from whichever algorithms run your preferred app store. Apps like ReplikaAI – a prototypical Samantha minus many of her higher functions and Scarlett Johansson – began to appear as ‘recommended’ across mobile storefronts. News channels in the UK continued to run stories on robots like Pepper, which were being deployed in care homes as an attempt to combat the shortage in care workers and to ‘talk’ with patients. And, naturally, stories began to appear about owners of digital assistants leaning increasingly on them for the sorts of services that a real-life assistant could confidently take to a no-win-no-fee law firm.
“Lockdown highlighted the problem of isolation and the deep human need for companionship,” says Dr John Danaher, lecturer at the National University of Ireland, Galway. Danaher literally wrote (or even more literally, ‘edited’) the book on robot-human intimacy. For ‘Robot Sex: Social and Ethical Implications’, Danaher collected 15 essays from foremost thinkers on human-robot sexual relationships, ranging from the practical (should sexbots be legal?) to the more philosophical (how might the robots feel about all this?). But even before lockdown, Danaher says, public acceptance of robot-human sex had climbed from 10 per cent early last decade to 40 per cent in some surveys taken pre-2020.
‘Sex’, of course, is not a byword for ‘relationship’. But it is (at least usually) an expression of some intimacy. For Danaher, whether the partners engaged in this intimacy are biological or artificial is almost moot – what matters is the sum of the parts, not what the parts are made from.
“[Like AIs], humans are products of millions of data inputs, some in their genes (and thus having their origin deep in our prehistory), some in their life experiences,” Danaher says. “I don’t think there is any core or essential ‘self’ or ‘soul’ in human beings. In a sense, we ‘download’ our behaviours and character traits from the ‘cloud’ of human genetics and culture. Who we are also depends a lot on particular circumstances and context; we are not always stable and reliable. I don’t see a fundamental difference between humans and robots in this regard.”
Not that human beings need millions of data points to find themselves developing emotional bonds with inanimate objects. Before baby humans can even waddle upright, we see them forming what appear to be loving attachments not just to non-human pets, but to inanimate objects. No parent whose toddler has ever left a favourite stuffed toy on holiday has been unclear on the depth of the child’s connection to Flopsy. And while we mostly outgrow connections to toy animals, our ability to form connections outside our species never really goes away.
Dr Diana Fleischman, an evolutionary psychologist and lecturer at the University of Portsmouth, recently asked her Twitter followers to imagine how society would react in the aftermath of an extinction-level event (for cats).
“The other day I did a poll on Twitter and I asked: ‘if all the cats in the United States died in a week, would people have more children?’,” she says. “Cats are warm, they seek out care, they like people who feed them. But they’re not, you know, incredibly trainable or really very responsive or sophisticated in terms of the companionship that they offer. [But] it does seem like they provide the minimum trappings of what is needed to form a deep emotional bond.”
Fleischman’s ideas on robots as companions that provide more than sexual gratification (like Danaher, she has written and spoken at length on sex robots) turn on a similar idea of surrogacy – if you take away a person’s opportunity for human interaction, they seek out replacements onto which they can project relationships deeper than those substitutes could really sustain.
Fleischman – pregnant herself at time of writing – talks at some length about anecdotal evidence she has collected during her pregnancy. She talks about the women undergoing fertility treatments who announce their intentions to buy a puppy as a kind of rebuke to their own bodies, or reward themselves with a night of heavy drinking for each month of bitter disappointment (she recommends neither).
“[Of] those two things, one is really literally analgesic,” she says. “But the other seems like it has a surrogate quality to it.”
But even once we accept that the precedents exist for humans forming deep bonds with non-human companions, Fleischman cautions that companion robots are unlikely to be one-size-fits-all models as depicted in much science fiction. Men and women both seek out companionship and love – but what they rank as important in a relationship differs according to gender. Fleischman offers the example of ‘catfishing’: the process of stringing a stranger along online for nefarious purposes (often some form of theft) with promises of intimacy. Women are more often manipulated emotionally, she says – their catfish might let slip that they have cancer but not the means to pay for treatment, for example. Men, she continues, are more commonly taken in by criminals pretending to be women they consider physically out of their league.
This also lines up with the findings of another of her colleagues, who examined reactions to a US reality show in which men and women who ‘stepped out’ (or ‘cheated’ – Fleischman is speaking via Zoom from Texas and admits that she is somewhere between vernaculars) are confronted for the judgement of the viewers at home.
“The women were more likely to ask their men, ‘Do you love her?’ And men were more likely to ask, ‘Did you f#ck him?’” Fleischman says. “The women are more emotionally jealous and the men are more sexually jealous. It is one piece of the whole puzzle of women being more emotionally and romantically motivated.”
This presents an obvious engineering problem for the would-be makers of companion robots: faking emotional investment is more challenging than faking a nice pair of breasts. The two halves of this market want different things (as is borne out by the almost total market dominance of sex dolls aimed at men) – and one of those things is much, much harder than the other.
“Women are more motivated for relationships where there seems to be some possibility of being ‘provisioned’,” says Fleischman. “When you get attention, it’s an honest signal of ‘provisioning’. You can’t give attention to multiple people at once: if somebody is talking to you, paying attention to you, sitting with you, it’s an indication that they’re uniquely invested in you. An AI could have this unlimited focus.
“Another thing that women pay attention to in terms of investment is what I call ‘mental real estate’. You know, I’ve gotten really upset if my husband doesn’t remember my mother’s sister’s name. I feel like he should know all my close blood relations’ names! Of course he doesn’t and that’s probably not an indication of him being likely to leave me – but to me, knowing that a large part of his hippocampus largely belongs to me is reassuring! And a [successful] AI would also be able to provide those cues.”
Chances are if you do remember the names of all of your partner’s blood relations, some computer or AI may already be involved. Today, we largely remember our friends’ birthdays with the help of digital calendars. But while these databases integrate neatly with, say, a new phone – how warm and fuzzy would we feel with a robot which we knew was only referring to our mother’s sister by name to trigger that exact emotional response? At what point would we begin to feel manipulated by a romantic, robotic partner? And would we be comfortable granting it access to reams of deeply personal data on ourselves just to help fuel this illusion of how endlessly fascinating we are to it?
“Whether companion robots are used to manipulate us or sell us other products depends a lot on the underlying business models,” Danaher says. “Apple makes money from selling products and not from selling personal data; Facebook does the exact opposite. Both are successful companies that have chosen different paths. It’s a shame that many companies have followed the ‘surveillance capitalism’ model in recent times. I think we should be deeply concerned about companion robots that are dependent on this business model; but it is not the only possible business model.”
The business models that drive the robots of the future may already be familiar – but what about the models themselves? Dr Kate Devlin, author of ‘Turned On: Science, Sex and Robots’, is adamant: we should, for the foreseeable future at least, stay away from humanoid robots altogether. In particular those that look like Jude Law.
That isn’t completely fair: Devlin’s issue is with Law’s appearance in the 2001 Steven Spielberg film, ‘A.I.’, in which Law plays Gigolo Joe, designed to be the ultimate ‘ladies bot’ (“Once you’ve had a lover robot,” he whispers to a nervously trembling first-time client, “you’ll never want a real man again.”). His programmed charms (lover bots come with tinny speakers to set the mood, activated by a disturbing crick of the neck – but can also breathily insist that their clients are “goddesses” who “wind them up inside”) leave Devlin audibly cringing.
“He had a purpose, which was to provide women with pleasure,” she grants Law’s character. “But it was done in a very clichéd romantic way. The music would play... it’s all couched in the ‘Women want the romantic sex! Women want to be wooed!’. There was this very gendered, stereotyped expectation of what a male sex robot would be like.”
The screenwriter’s logic appears sound: if men want sex robots that look like beautiful women, women should want sex robots that look like Jude Law. But the research doesn’t bear this out. Devlin recounts a project in which participants were asked to describe or, in some cases, design their ideal sex robot. Sex being sex and people being people, the answers were varied and bizarre – one imagined prototype included a screen for a face: all the fun of the fair from the neck down, but with a sort of dedicated erotic e-reader available at eye-height for potential mid-coital stimulation/inspiration.
Curiously, not only was the Jude Law-bot not the overwhelming favourite AI proposed, but some participants refused even to countenance sex with something that had... well, a countenance.
“We interviewed people in the street, like a vox pop,” says Devlin of her time gauging how prepared passersby might be to engage in a robotic hookup. “And some people said, ‘Yeah, if it looked like a human,’ – but other people went, ‘Well, not if it had a face.’
“So what are people’s expectations? They’re being fed a very particular idea of how [robot companions] should look. But when you start saying to people, ‘They can look like anything,’ then the imagination really opens up.”
Perhaps designing companion robots that deliberately don’t emulate human beings is the answer to that common sci-fi question of whether or not a relationship with a robot can ever be reciprocal. A robot with a Kindle for a head isn’t likely to hoodwink many people at the singles bar. When science fiction shows us robotic lovers, they are overwhelmingly portrayed as human (at least outwardly). This trips something defensive in us: the sense of unease or revulsion we feel when a non-human entity tries to deceive us into thinking that it’s human is such a common phenomenon (thanks largely to CGI in films and video games) that it has its own name: ‘the Uncanny Valley’. Perhaps in the future, the engineering of humanoid robots will progress to the point where we really can’t tell (without a signed waiver and a toolbox) whether a ‘person’ is flesh and blood or wires and circuitry. But in the meantime, maybe the best answer is simply not to bother attempting to emulate humans and explore the outlandish.
“You can form a friendship; you can form a bond,” says Devlin of non-humanlike machines. “That bond is one-way, but if the machine shows you any form of response, then you can project onto that and feel social. We treat machines socially because we are social creatures and it’s almost enough to make us buy into it. Not delusionally, but to suspend our disbelief and feel a connection. People feel connections with their vacuum cleaners: mine’s called Babbage and I watch him scurrying around, I pick him up, I tell him, ‘Don’t go there!’ It’s like having a robot pet – but I’m perfectly aware he’s just a lump of plastic. People talk to their Alexas when they’re lonely and they want to chat. So, yes: you can feel a bond there.
“It’s not the same as a human friendship: it’s a new social category that’s emerging that we haven’t really seen before.”
As for the question of reciprocity, Devlin doesn’t see a barrier there with robots that doesn’t already exist in human relationships.
“You’ll get a lot of people going, ‘Oh, that’s not true friendship; that’s not real.’,” Devlin says, sneeringly. “Well, if it feels real and if you’re happy in it, is that a problem? It’s the same people who say you can’t have true love unless it’s reciprocated, which is the biggest lie I’ve ever heard because there are so many people out there who are falling in love with people they’ve never even met! Fictional people! Film stars! Everybody! Those feelings are very, very valid to someone who’s experiencing them.”
“How are you guys doing here?” the waitress asks with perfect waitress-in-a-movie timing as Twombly and Catherine sit, processing the former’s new relationship with Samantha in silence.
“Fine,” Catherine blurts. “We’re fine. We used to be married but he couldn’t handle me; he wanted to put me on Prozac and now he’s madly in love with his laptop.”
In 2013, Spike Jonze’s script for ‘Her’ won the Academy Award for Best Original Screenplay (the film was nominated for four others, including Best Picture). A year later, Alex Garland’s script for ‘Ex Machina’ would be nominated for the same award while arguably presenting the same conclusion: we are a species that loves openly and to a fault. If we are lucky, we are born into loving families with love to spare and lavish on stuffed animals we lose underneath hotel beds. If we aren’t born lucky, or have love taken away by circumstance (say, the emergence of a pandemic), we look for it on our phones, in our digital assistants – even in our robot vacuum cleaners. We project and we anthropomorphise and we seek out things to love – perhaps especially when we’re all alone.
With big tech companies now actively courting us – giving their AIs human names and installing them in everything from TV sets to headphones – perhaps the question isn’t whether we’re capable of loving robots, but when the stars will align for each of us to meet our special ‘someone’.