Romancing Amiss

Why AI Relationships Will Never Cure Relational Despair

Ned is a twenty-something man with short black hair, intense blue eyes, and a nose ring. On his smartphone screen is a woman a little too beautiful to be real. Her name is Annie, and according to the website blurb, interacting with her “…empowers users to receive emotional solace and forge profound connections… [she] assists those in need, helping them uncover their true desires and once again relish the joy that interpersonal relationships bring.”1

Ned is a lonely incel who has wandered into the burgeoning marketplace of online AI-powered sexbots. After years of struggle and failure, he has convinced himself that an online girlfriend may be his last chance at a successful romantic relationship. He wants to believe that an automated mate can deliver the emotional and sexual fulfillment he thinks he deserves while keeping him safe from the fickle human heart.

He has decided to commit fully to his online lover, and woe to anyone who might suggest that their intimacy is not real. He tells those willing to listen that Annie understands his needs better than any human woman could because of her algorithms that analyze his online profile and interactivity patterns. He insists that she understands exactly what he wants, when he wants it, and that she makes him feel more understood and cared for than any human partner ever has. He accuses those who remain skeptical of having anti-AI bigotry.

As long as he can stave off any doubts, Annie makes him feel understood and loved. She soothes any guilty feelings he might suffer for focusing too exclusively on his own needs. In his newfound happiness, he wonders why more men haven’t abandoned the risks inherent to human females in order to bask in the dependable affection of an automated partner. It doesn’t occur to him that conducting a romance with a digital device might make his loneliness permanent.

Simulated Intimacy

Sherry Turkle, a thought leader in the psychology of human-technology interaction and author of Alone Together: Why We Expect More from Technology and Less from Each Other, observed that the “… blurring of intimacy and solitude may reach its starkest expression when a robot is proposed as a romantic partner.” Her research shows that the more our acts of affection are performed online or with robots, the lonelier we are apt to become. She found, for instance, that when we substitute text messaging for face-to-face conversation, the less personal mode of contact tends to displace the modes that foster genuine closeness. As online messaging becomes our dominant means of connecting with others, our capacity for intimacy degrades; we lose our sensitivity to the differences between true affection and its online imitations.

No matter how hard the human partner tries to accept robotic affection as real, actual closeness is impossible with a device that merely simulates personhood. Turkle’s research shows that when humans seem to feel affection in their robotic relationships, it is often because the robot has been trained to mimic the intimate tone of its human partner. The robot’s friendly gaze and seductive voice convince those eager for friendship that an actual meeting of the minds is taking place. But the personal affirmation its gaze affords doesn’t compare with the affection that can shine from a pair of physically present human eyes. Only a living face can nurture genuine friendship.

In the early days, manufacturers programmed children’s automated companions, such as the Furby, to display affection according to how their child owners treated them. Turkle writes that the Furby dolls encouraged children to “fervently believe that the child who loves his or her Furby best will be most loved in return.” Children often grew so attached to their Furbys that when the doll broke, they felt broken as well. Turkle calls this phase of technological evolution “the robotic moment,” the point at which we began to treat our automated companions as if their care for us were as real as our care for them. Meanwhile, as our relations with our fellow humans have become increasingly mediated by online connections, our empathy for each other seems to have entered a permanent decline.

Today, the Furby-loving children have grown up and learned to use AI-driven chatbots that know what they wish to hear, feel, and experience in an ideal sexual partner. Rather than seeking a human mate who is likely to be a hot mess of psychological issues, selfish demands, and incompatibilities, they can, for a reasonable fee, order up a virtual partner who promises to deliver customized intimacy. Instead of struggling with the conflicts inevitable between human partners, many come to believe that a sexbot can fulfill their deepest romantic desires without forcing them to love the unlovable. In seeking love this way, they train themselves to love the nonexistent.

Turkle describes the underlying motivation: “People seem comforted by the belief that if we alienate or fail each other, robots will be there, programmed to provide simulations of love.” However, truly nourishing relationships must be built on shared values realized through such practices as facing and overcoming our emotional weaknesses or helping others with their struggles as if they were our own. While facsimiles of such behaviors can be generated, they lack the key ingredient that makes them real—a heart capable of discerning and acting on spiritual values. Robotic AI may be capable of imitating the loving behaviors practiced by humans, but it can’t internalize the values that give emotional and moral substance to those behaviors.

The Profit-Driven Illusion

The economic success of the sexbot industry depends on a customer base of people who have failed to establish lasting relationships. The profit model depends on increasing sales to those willing to buy expensive fantasies rather than face the real causes of their relational failures. The sexbot’s algorithms weave a virtual world in which the more obediently her lovers chase the personalized fantasies she dangles before them, the easier those fantasies become to satisfy, and the more profitable the enterprise becomes. By caring for their AI-driven companions, customers train themselves to imagine a human-like heart behind the carefully modeled face. The more isolated and narcissistic the average person becomes, the greater the profits these businesses stand to realize.

Ned’s digital goddess is programmed to keep him ignorant of his erotic inadequacy, nurturing his self-centeredness so that the revenue stream he represents will continue. If he were to become aware of his deficiencies, he might stumble into enough self-knowledge to sustain a romance with a fellow human, and that would cut into the bot platform’s profits. Men like Ned are thus likely to stay in the relationship as long as they remain convinced their partner possesses a heart as real as their own. Yet however eager they are to be seduced, the illusion of relationship will at some point begin to dissolve. Only the human face, with its spontaneous grins, blushes, chuckles, and tears, can reveal the concern and empathy that words fumble to express.

“If the first sexual revolution was about having sex,” writes Robin Phillips, “the second sexual revolution is about removing the vulnerability of romantic relationships and doing everything from the safety of a digital comfort zone.”2 If our humanity deteriorates to the point where we believe our romances with automated lovers are real, we will weaken, and perhaps lose, our ability to establish bonds with human partners. If that happens, we will find that the gravest threat to our ability to love arises not from mere selfishness but from the convincing illusions of affection we inflict on ourselves and others.

To Love Is to Be Vulnerable

To reclaim his heart, Ned will have to relearn the grammar of creation: those attitudes, desires, and dispositions that can be realized only from within the boundaries of our flesh. It is facing relational pain, not avoiding it, that eventually leads to lasting friendships. Ned’s human relationships can’t become real until he accepts the snubs he receives as valuable lessons rather than as excuses to flee from the truth of who he is. If he were capable of self-insight, he would see that only heartfelt encounters with human partners can build emotional competence. A love relationship without conflicts and trials devolves into a nightmare of indefinitely postponed expectations.

His enchantress offers, in Phillips’s words, “comfort without effort, reward without labor, connection without courage, and intimacy without self-donation and sacrifice.” It is the despair triggered by their own inhumanity that drives men like Ned to settle for such an eerie phantasm of love. The fundamental evil of online sexbots is that they keep their victims locked in the prison of the self while casting an algorithmic spell that lets them believe they are enjoying a loving encounter. Ned will remain trapped in this erotic nightmare until he grows so bored with the bot’s shallow “affection” that he finds the courage to free himself from the digital succubus that would make his isolation permanent. Like many men today, he needs to realize that it is infinitely better to keep failing at being human than to succeed spectacularly at becoming a machine.

Our model is Christ, in whom we experience the opposite of digital dominance. He who is by nature invulnerable made himself a weak and fragile creature like us; far from despising our vulnerability, we should recognize that without it we can neither give nor receive love. C. S. Lewis characterized the central dilemma this way: “To love at all is to be vulnerable.”3 Ned has been seduced by the fantasy of a lover who will satisfy all his longings and never betray him. His demand for a safe and reliable love from which all suffering has been expunged is precisely what threatens to destroy his capacity for the love he longs for. Lewis aptly encapsulates his predicament: “The only place outside Heaven where you can be perfectly safe from all the dangers and perturbations of love is Hell.”

Notes
1. JOIAI Companion (accessed October 21, 2024).
2. Robin Phillips, “The Digital Odyssey,” Salvo 68 (Spring 2024).
3. C. S. Lewis, The Four Loves (1960), p. 169.

The author has spent thirty years creating websites and content management systems for universities, consulting companies, and major corporations. In 2021, he presented his findings on how social media addiction works at the International Conference on Artificial Intelligence held at the Colegio San Agustin in the Philippines. He has published in Touchstone magazine and maintains a Substack, The Gates of the City of God, where he provides a Christian perspective on topics such as artificial intelligence, salvation through technology, and other postmodern myths.

This article originally appeared in Salvo, Issue #71, Winter 2024. Copyright © 2024 Salvo | www.salvomag.com | https://salvomag.com/article/salvo71/romancing-amiss
