
The Ethical Problem With AI Might Be Different Than You Think

In light of the recent position taken by the Roman Catholic Church on the ethics and use of Artificial Intelligence, and the contrasting dearth of conversation in the Episcopal Church about the effects of AI training and use, it seems timely for me to write as a priest and a 10-year veteran of the IT industry.

There is so much to say about the current state of the world that not all of it can be said. The issues of climate change, armed conflict around the world, and the infection of hate in the hearts of so many deserve primary attention. I have seen, as I’m sure you have, a healthy level of work on those topics in Episcopal circles. AI has recently emerged as a subject of interest in the common conscience, and only more recently have its effects and importance been elevated to the level of a crisis on par with those other issues.

Research and development of the tools we now call AI have occurred for decades. Even before the first electronic calculating machines were a reality, humans wondered what it would look like for their creations to attain skills on par with humanity. The infamous hoax of the Mechanical Turk, built in 1770 and presented as a chess-playing machine better than most humans, is just one example. It’s no new thing for humans to want tools that equal or exceed themselves.

What is new is that we can now, for the first time in human history, accomplish this dream. The ethics of this new frontier are dizzyingly unsettled. Academia is struggling to deal with AI-assisted papers submitted by students in place of genuine work. Is AI a research tool, or is it cheating? Deepfake images and videos allow the imitation of non-consenting persons in myriad ways. Is this creative expression, free speech, or assault? I see it as assault.

When I mentioned research and development, I was invoking thoughts of scientists working for the sake of discovery with one eye on their work, another on their funding, and a third on their ethics review board. Yet what happens when the work to create human-equivalent tools like AI shifts from science to industry? In short, there is very real potential for extreme harm unless deliberate, universal steps are taken to forestall it. It is like building a mountain in hopes of making money from the foothills.

The truth is that AIs are actually quite limited. Large Language Models (LLMs) like ChatGPT can be so wrong in their responses that a whole genre of online videos has emerged to showcase such mistakes. Image-generating Diffusion Models (DMs) like Stable Diffusion are so derivative of the images they are trained on that artists are up in arms about the theft of their work. It isn’t all wrong answers and humans with two heads and 17 fingers, though. If it were, we wouldn’t be talking about it. The fact remains that modern AIs are capable of amazingly accurate transformation and generation within specific bounds. If we add to that the intervention of human operators to check and refine their output, the influence of these tools begins to expand exponentially.

Both LLMs and DMs rely on vast quantities of training data and processing power to generate a single output. Enormous amounts of seed data must be ingested into a framework of connections and computation, generally called a model, before these AIs can work at all. There is a mountain of information and processing power behind every conversational response from an LLM, and the same is true of the images coming out of DMs. The outputs are minuscule compared to the effort it takes to produce them. What we see as the results of AI are just the foothills; the data and computation behind them are the mountain.

In that mountain of data and deluge of outputs lie the moral questions that the church is called to notice and answer. What does it mean to take all the digital data from people’s lives and use AI to extract more money from them, to blackmail them, or to put them out of their jobs? What about their consent to having their data used? Who owns the data of our lives?

The primary creators, investors, and purveyors of AIs to the world are not academics but corporations. AIs are making the shift from exploration to capitalist endeavor, which puts the Episcopal Church in a particularly sensitive and important position. We are, writ large, the church of corporate leadership and boardroom regulars. We also have a strong tradition of activism and advocacy for worthy causes.

If the ethical questions about the development, training data, and use of AI are actually those of business, surely the church will be told not to become political about it. More pointedly, AI corporations might rightly claim to have all the experience of the business world, to which they are entitled, and on that basis tell the church not to intrude where it does not belong. Such claims are as unreasonable as they are misguided: the church has a God-given responsibility to ask, loudly and in public, what it means for the dignity of every human being that corporations are making so many millions in profit from uncompensated and unconsented work.

Corporations are building mountains out of things that do not belong to them, to make money off foothills that are pretty to look at and seem nice enough at face value. They are changing the landscape around us with reckless abandon, and the effect on us is already showing itself, as it does around any man-made mountain or open-pit mine. The persons most affected by corporate AI proliferation will be those who cannot defend themselves: the poor, those without relevant education, and the marginalized. Consider this scenario: a company decides to pay a person monthly for installing an app on a computer or smartphone. That app will scrape every bit of data the user creates, whether the minutes devoted to watching a video or how many times the user answers work email after hours, and collect it for an AI training data set.

Who will respond to such an appeal in droves, just to gain the smallest taste of passive income? On the other hand, who will have the privilege of thinking such schemes are silly, or that the participants are consenting, so what does it matter? Friend, when was the last time you read an End User License Agreement rather than just clicking “I Agree”? There can be no consent in the current state of digital information ownership when a person is coerced by a system of carefully legal exploitation.

The poor will become digital serfs, farming the land and receiving only a small part back to subsist on. When our Lord spoke of defending the poor and widows, he was pointing us to the persons who were vulnerable to the whims of the socioeconomic systems around them. Jesus advocated for their just treatment. He urged those who have strength and power to become champions for those who have none.

I am, both theologically and technologically, an idealist. I have faith in the people of the Episcopal Church. We have been entrusted with power and responsibility through our unique position in society. Insofar as our reach extends, we can make a difference right now by standing up for the digital rights of individuals. We can make sure that ethical questions about AI are kept active in boardrooms, C-suites, and churches. We can advocate for limits on AI and for people’s control over the data of their lives. When we do, it will be like a cup of water offered to one who cannot draw water. Let us serve each other, and through that, serve Christ.

Fr. Chip is an Episcopal priest and native of the Midwest currently living in Norfolk, VA. He’s spent more than 20 years searching for a way forward for faithful people in our technology-transformed world. He became a priest after working in the IT industry, which deeply informs his perspective and current ministry. You’ll find him spending his free time picking locks for sport, reading science fiction, or engaging in various tech-related hobbies.

Chip Russell
The Rev. Chip Russell is an Episcopal priest living in Norfolk, Virginia.

1 COMMENT

  1. Thank you for this. I can’t help but think, however, that this conversation is appealing to the same church that has spent the last several decades uncritically pushing us to adopt exploitative technologies like Facebook and Twitter (and helping massive corporations and their CEOs become offensively wealthy), and to do away with printed materials in favor of devices built with exploitative labor practices and materials acquired in morally reprehensible ways (one of which I’m using to type this comment)… We never seem to talk about technology unless it seems like something “young people” are using and which we feel we need to adopt in order to gain a modicum of relevance. So, thank you and others of your cohort for hoping to raise the issue. I’d like to believe that we might actually take it seriously this time!
