Good Tech, Bad Tech

Moral Principles for Discerning Good Uses from Bad

The Human Genome Project, an international scientific effort to produce the first-ever sequence of the entire human genome, cost over $2 billion in current dollars and took almost 13 years (1990–2003). Today, thanks to advances in technology, a personal genome can be sequenced in a few days for as little as a few hundred dollars. The economics matter because personal sequencing can facilitate customized medical therapies—cancer treatments are a prime example—that yield better outcomes.

Many technologies have myriad applications, and AI is widely expected to improve medical diagnoses and treatments. Computational medicine is making significant strides and mostly in ways that are beneficial to human flourishing. I call attention to all this merely to highlight the fact that, though many of us have the creeping sense that we are being monitored, tracked, banned, seduced, distracted, and generally robbed of our human agency by technology, that is not the full story. Our worst suspicions regarding the malevolent effects of certain applications of technology are valid, but there are also humane and moral pursuits toward which technology is being directed.

What I want to explore here is less about how to fend off unwanted effects and more about how to construct a general framework for morally differentiating between “good” technology and “bad.” How do we avoid uncritical acceptance of something on the one hand and undiscriminating rejection on the other? What are the inherent attributes of a particular application that make it something we should or should not pursue? As someone who is not just a consumer of technology but also an inventor of it, I want to develop a more structured, less reactive moral calculus regarding technology in general, something explicit that is less dependent on intuition and that can inform my own inventive work.

Enamored of Evil Knowledge

The Judeo-Christian origin story explains the tragic circumstances of our existence as the fallout of the first couple’s pursuit of knowledge that exceeded their moral capacities. The very context of the Christian gospel sets it as a remedy for the downstream consequences of that first act of disobedience. It is noteworthy that the immediate effect on the first human beings for having acquired this forbidden knowledge was an explosion of self-consciousness. Prior to their eating from the tree of the knowledge of good and evil, self-consciousness was so absent from their lives, they were unaware even of their own nakedness. But as soon as the forbidden knowledge was acquired, there followed a veritable avalanche of pathologies that placed the self at the center of human interests. Beginning with blame-shifting, followed by self-absorption, and culminating in self-worship, the history of human behavior thenceforth revolved around satisfying, protecting, and promoting the self.

A Poor Center

In our current moment, at the peak of a cultural obsession with empathy, the self has metastasized to the point that some of the most toxic pathologies exhibited by the poor are the result not of economics or injustice but of individuals having never developed genuine interests in anything outside of themselves. In Life at the Bottom: The Worldview That Makes the Underclass, Theodore Dalrymple writes,

In the schools, young children are no longer taught in whole classes but in little groups. It is hoped that they will learn by discovery and play. There is no blackboard and no rote learning. Perhaps the method of teaching by turning everything into a game can work when the teacher is talented and the children are already socialized to learn; but when, as is usually the case, neither of these conditions obtains, the results are disastrous, not just in the short term but probably forever.

The children themselves eventually come to know that something is wrong, even if they are not able to articulate their knowledge. Of the generations of children who grew up with these pedagogical methods, it is striking how many of the most intelligent among them sense by their early twenties that something is missing from their lives. They don’t know what it is, and they ask me what it could be. I quote them Francis Bacon: “It is a poor centre of a man’s actions, himselfe.” They ask me what I mean, and I reply that they have no interests outside themselves, that their world is as small as the day they entered it, and that their horizons have not expanded in the least.

“But how do we get interested in something?” they ask.

In essential ways, being able to function in the world comes as we learn to suppress our tendency toward devotion only to ourselves and instead to develop genuine interests in the larger world around us. A profound lack of interest in anything beyond oneself, or habitually perceiving the world only through the lens of the way we feel, is, as Jordan Peterson has put it, “the best pathway to misery.”

If this is so, then perhaps we can articulate a moral principle regarding technology that takes this into account. Something like:

Any technology that serves to compound self-absorption or to amplify the self-regard to which people are inherently prone is harmful.

If we establish that technologies fueling the monstrous self-regard lurking in every human heart serve evil, what principles might follow that would, by contrast, provide a positive moral framework for discerning the technologically good? We have a little more to unpack before we get there.

A Wider Lens

For much of my life I read the New Testament Gospels through a kind of bifocal lens that was part moral-therapeutic and part psycho-therapeutic. I saw what I was reading as primarily concerned with my moral and psychological rehabilitation and well-being. But something I could never really avoid seeing in the text, yet never really understood, was the background echo of a larger cosmic conflict playing out, one to which humanity was not exactly tangential but in which it played a less active role than my self-regard had led me to assume.

This cosmic clash seemed equal parts fantastical and puzzling, since its presence in the text clashed with my fallen inclination to suppose that everything in the world revolves around me. I still believe we are a central part of the story, but only in the same sense that prisoners of war are central to the daring plot for their rescue; we may be the beneficiaries of all the action, but we ourselves are neither the heroes nor the main attraction. Seeing ourselves against the backdrop of this cosmic conflict and the goals pursued by the actual combatants may offer some helpful clues in developing a framework for distinguishing between virtuous applications of technology and malevolent ones.

The Rescue

The Gospel writer Luke records an encounter in the wilderness between Jesus and Satan over the kingdoms of the world. The attempt at negotiating doesn’t go well for Satan. Jesus’ response was (I paraphrase), “Pound sand.” But it was what Luke records Jesus doing after Satan’s departure that I found illuminating. Immediately after the contest of wills, Jesus proceeded to the synagogue of his hometown and read aloud the following passage from the prophet Isaiah:

The Spirit of the Lord is on me,
because he has anointed me to proclaim good news to the poor.
He has sent me to proclaim freedom for the prisoners
and recovery of sight for the blind,
to set the oppressed free,
to proclaim the year of the Lord’s favor.

Jesus then sat down and made clear to the congregants that he himself was the one to do these things, and he no-kidding intended to do them. Specifically, Jesus announced that he was going to unwind imprisonment, repair injury, and put a stop to the oppression that characterizes human life. In other words, he intended to recover and expand freedom in the world and to restore the intended functioning of human bodies, which had gone awry. Immediately afterward, Jesus went to a nearby village and rescued a man held captive by a demon. He then healed many others of sickness and injury, while also kicking more demons out of their hosts.

So the sequence of events is as follows: Satan offers to give Jesus the kingdoms of the world in exchange for being worshiped; Jesus says, “No,” and proceeds to announce that henceforth he himself will be setting people free and healing their injuries; he then travels throughout the region, kicking demonic backsides and taking names. The narrative comes across as Jesus demonstrating to Satan that he never had any intention of negotiating for the release of the POWs—he was always going to take them by force (force against Satan, not the POWs).

There’s an interesting aspect to Jesus’ encounter with the first demon. The demon loudly announced Jesus’ identity as “the holy one of God,” whereupon Jesus said, “Be silent!” Readers of English tend to read this as a request, or perhaps a command, for the demon to comply. But the verb form Luke uses is in the passive voice. He does not have Jesus asking or commanding silence from the demon—he has Jesus imposing it. The mere power of his words is such that, without any appeal to the demon’s will, Jesus imposes compliance. Jesus, in effect, forcibly muzzles the demon.

I suggest there are morally loaded agendas reflected in these events. Jesus’ actions enacted, on the ground, the specifics of the passage he had just read. They advanced two related but distinct agendas: the return and expansion of human freedom and the restoration of human bodies to healthy functioning as originally intended. He freed those who were captive or oppressed by demonic powers, and he healed human injuries and diseases.

So perhaps another principle by which to identify beneficial uses of technology might be stated like this:

Technologies that enact true human freedom and that restore the right functioning of the human body for its purpose in the world are morally good and desirable.

Toward What End?

Of course, working out the particulars of any applied technology can be problematic. Many can facilitate the moral goods we have identified above, but those same technologies may also be put to immoral uses. Artificial intelligence can be used to defraud, but it can also be used to heal. The same systems deployed to surveil and censor can be used to expose those who would curtail human freedom. There is a hilarious new AI engine that modifies online images of scantily clad women by adorning them with more modest attire. In many cases, it isn’t the technology itself but the application of it that matters.

Andrew Klavan, pondering the announcement of Neuralink’s first human trial, had this to say:

The first [brain-computer interface] implants will be irresistible because they’ll let blind men see and lame beggars walk or some such useful thing. But pretty soon, people will be popping these buggers into their brains like eating M&M’s until everyone is transformed into a human-machine hybrid with Klaus What’s-his-Face secretly reprogramming us to eat caterpillar grubs.

With his characteristic good-natured humor, Klavan puts his finger on a paradox: many technologies can be used both for good and for ill. The distinction is simply a matter of how they are applied. Perhaps a distinguishing factor, then, is whether such knowledge, or any application of a technology, can ever facilitate the restoration of God’s design. If the only possible uses of it thwart the human telos, it should be avoided.

By What Standard?

And that brings us to the matter of underlying worldview assumptions. It is simply not possible to define a moral framework for assessing technology apart from foundational presuppositions about what human beings are and what we are for. There must be a reference point that stands behind such assumptions and against which any notion of the good can be evaluated and understood. If making blind men see or lame beggars walk is a worthy pursuit, we must be prepared to explain why: we are presupposing that eyes were made for seeing and that feet were made for walking. Blindness and lameness, then, are contrary to the good.

If using surgical technology to alter the healthy genitalia of children is wrong, it is not wrong simply because a child cannot give adequate informed consent, but because doing so degrades a human body into something other than, or less than, what it should be. Gender transitioning is thus morally indistinguishable from blinding a child or cutting off one’s own ears. The human body has a form, given for a particular purpose. Accordingly, surgeries that dehumanize patients by undermining their body’s purpose are immoral, since they are elementally destructive rather than restorative. They cause harm rather than healing. Similarly, if the moral foundation for human sexuality is about more than merely gratifying momentary psycho-sexual desires, then pornography, homosexuality, transgenderism, and pedophilia are wrong for a reason.

But, of course, it is only possible to say such things if there is a transcendent standard for the embodied human form against which we can assess alternatives. If there is no telos against which notions of the good can be evaluated, then we are adrift in a swirling sea of nothing more than competing opinions and appetites. As Fyodor Dostoevsky said, “If there is no God, everything is permitted.”

Love & Telos

In principle, then, only those technologies which in some way further the telos of human existence can be considered moral. Those that thwart it cannot. The second greatest commandment, Jesus said, is to love your neighbor as yourself. Where technology is concerned, that mission can be pursued only through moral uses and applications of it.

The author works as a senior fellow at a major semiconductor manufacturer, where he does advanced software research. He worked in technology startups for over 20 years and was for a time a principal engineer at amazon.com. He is a member of Lake Ridge Bible Church in a suburb of Dallas, Texas.

This article originally appeared in Salvo, Issue #71, Winter 2024. Copyright © 2024 Salvo | www.salvomag.com | https://salvomag.com/article/salvo71/good-tech-bad-tech
