In a famous scene in the 1968 film 2001: A Space Odyssey, a tribe of primates gains control of a drinking hole by turning the bone of a deceased animal into a bludgeon. In celebration, one primate throws the bone into the sky. As the bone-turned-club spins triumphantly in the air, the camera then cuts to a similarly shaped satellite flying in space. Director Stanley Kubrick’s meaning is clear: The human quest for new tools to do our bidding has advanced at a startling pace.
And we certainly have come a long way from bone clubs. Earlier this year we entered a new chapter in that ongoing odyssey: widespread public access to generative artificial intelligence, or technology that can recognize and predict patterns to create new material including text, images, code, music, and video. Thanks to tools like ChatGPT or Lumen5, questions about the role of AI in human life are no longer the domain of sci-fi films, but news headlines. People across the country — including striking workers in Hollywood, educators worried about plagiarism, and people who dare to use self-driving cars — are wondering what role AI has in the quotidian aspects of our lives and what the best ways are to interact with such rapidly advancing technology.
For Christians, AI poses its own set of questions: Is it OK to use AI to write a prayer or outline a sermon? Is AI able to provide counseling and spiritual direction for congregants seeking discernment and spiritual care? Can AI achieve personhood, and if so, what should our commitments to caring for them be?
I found some guiding answers in Noreen L. Herzfeld’s book The Artifice of Intelligence: Divine and Human Relationship in a Robotic Age. Herzfeld is the Nicholas and Bernice Reuter professor of science and religion at St. John’s University and the College of St. Benedict. In our conversation about her work, we touched on some of the questions outlined above, dove into the often overlooked ecological and environmental consequences of ChatGPT, and explored what theologian Karl Barth’s works on personhood may have to contribute to our conversations about how to interact with AI.
This conversation has been edited for length and clarity.
Zachary Lee, Sojourners: What was the inspiration for The Artifice of Intelligence?
Noreen Herzfeld: I was at a conference in Oxford, and the Anglican archbishop of England, who is in charge of their committee on artificial intelligence, asked me if I could recommend a book to help priests, bishops, and interested laypeople get up to speed on conversations about AI, specifically any questions and concerns there may be for Christians. I couldn’t think of one and so I decided to write the book that he had asked for.
Your work explores the possibility of whether humans can have authentic relationships with AI. What does “relationship” mean to you and what kind of relationship between AI and humans are you talking about?
Obviously, there are lots of kinds of relationships. I have a different relationship with my dog than I have with my partner. All our relationships exist on a spectrum, which I honestly think is a better way to talk about AI than personhood; you either are or you aren’t a person … it’s black and white. Thinking of relationships as a spectrum though, we can ask, “So where does AI fall on that?” I decided to use Karl Barth’s criteria for an authentic relationship. He outlines the following as criteria: You can look the other in the eye, you can speak to and hear the other, you aid the other, and you do all of this gladly.
These turned out to be four very good categories with which to examine AI because the first one [about eye contact] presupposes embodiment. That raises the question: To what extent does AI need to be embodied? What does it mean to look someone in the eye, be present with them, hear them, and speak to them?
These criteria also present questions about agency: How much agency does an AI really have? Can a machine do anything gladly? To do something gladly means that you have to have free will and you’re not coerced. Can a machine experience the emotion of actually being glad? This all points to the question of embodiment. I believe that the importance of embodiment, that importance of the physical, is one of the things that Christianity brings to the table of world religions in a way that other religions do not. I mean, the incarnation, the resurrection, the sacraments … they all deal with physical embodiment and its importance.
Related to that theme of embodiment, you talk about how our relationship with AI has shifted to where we now want them to be our partners instead of just our servants. Can you flesh out the implications of that idea?
One of our big problems is that we want AI to be both a servant and a partner. We want AI’s values and goals to be aligned with our values and goals. One of the big fears that people like Geoffrey Hinton and Elon Musk are talking about right now is that AI won’t be in alignment with human values and that it’ll decide maybe that it doesn’t need us around or that it will get out of control in some way. Well, as soon as you start talking about control, we’re moving from the partner into the servant realm.
I think of what St. Augustine said: “Oh Lord, you have made us for yourself, and our hearts are restless until they rest in you.” What he’s saying in that is that we are hardwired to want to be in relationship with something or someone that is other than human, that is other than us. I think this is part of our impetus for trying to locate extraterrestrial intelligence. It’s part of our impetus for trying to communicate with dolphins and whales. I think in some ways it’s part of the reason why I have a dog snoozing on the floor behind me. As a society, we are becoming less religious and so God no longer fulfills that need for us.
We no longer believe that we live in a world where God is all around us, where angels are around us, where the saints are still with us. It gets lonely without all of that. And so, we reach for the computer to fill that need. The problem is that we run into this dichotomy where we’re looking for an authentic partner and yet, at the same time, we want it to serve us. You can’t have both. Not at the same time.
I’ve been thinking about how, at best, AI can only reflect our biases rather than transcend them. Ironically, in our desire to make something that is transcendent or above us, all we’re able to do is just create something that reflects us.
Exactly. In our creation of AI, we are projecting our own image onto the machine. One of the things that got me studying theology in the first place was that I was interested in why people wanted to make a computer in our image. It seemed to me that computers were the most useful to us precisely when they did the things that we didn’t do well. In other words, when they were complementary to us, rather than mirroring us — when they were crunching the big numbers, dealing with the huge data sets, etc.
In 2021, I spoke with Cornell professor J. Nathan Matias about AI. He described a sentiment that you captured well in your book: Conversations we have about a Terminator-like takeover in the future distract and take away from the very real damage AI is doing now. For example, AI may be trained to criminalize one group because of the biases feeding it a data set.
Exactly. It doesn’t make for as good a movie. One of the things that I’ve been giving talks about recently is AI’s effect on climate change. Climate change is not coming. It’s here. When we think about AI, we don’t think about the fact that a search with ChatGPT uses five to ten times as much energy as a search with Google.
Oh, really?
We don’t realize the energy use that goes into computing; it is hidden from us. We think, “Oh it all happens up there in the cloud. It’s all so clean. It all happens in cyberspace.” Well, it doesn’t. It happens in huge data centers that not only are tremendous fossil fuel gobblers, but they take a great deal of water as well — for the cooling of all those servers that are doing our calculations.
So, we think, “Oh my God, the Terminator might be coming down the road.” And I’m thinking, “Oh my God … we’re throwing away our beautiful Earth on these machines.”
You reference other literary works where human beings have attempted to create in our image, like the story of Pygmalion, Mary Shelley’s Frankenstein, etc. What would you say to people who aren’t as concerned that we may be making AI in our image and that these technological developments are a natural and good part of evolution?
If you go back to the prophets in the Old Testament, they define an idol as anything that you create with your own hands and then bow down to it. It’s so easy to make an idol out of AI, and we do what the prophets warn us about when we put AI in a position where it makes decisions for us and where it’s making decisions about other people’s lives. When we do this, we idolize ourselves because it’s our image that is out there. We’re trying to be God rather than letting God be God.
You know, my feeling is that we need to let God be God. We need to let humans be humans, and we need to let computers be computers and machines be machines. It’s much more tempting to make these category errors of blurring with something like artificial intelligence.
There’s also this question of when we make an image of ourselves, we’re often trying to capture what it is we value the most in ourselves. And right now, in our society, we value rationality and intellect. We value our brains the most and so with AI, we’re trying to isolate and expand that one piece of humanity. We’d like to be disembodied minds because we think that’s a way that we could approach immortality without needing all that God and religion stuff.
Christianity teaches that we are always an embodied whole. And I believe that it is vital that we recognize that one of the problems with trying to separate the intellect from the body is we lose the salience of emotion because emotions happen in our body. When we feel fear, our hearts start racing, our hands get clammy, etc. So, to separate the intellect from the body, which is the seat of our emotions, is to separate the intellect from love. And that’s a very dangerous thing to do.
Christianity also teaches us to embrace our limitations, our finitude. We often don’t want to think about our limits; we want our dreams to go unchecked.
The theologian Reinhold Niebuhr said that one of our big impetuses toward sin is precisely this: trying to imagine ourselves as not being limited. We are finite creatures, but we can glimpse the infinite, and then we tend to think of ourselves in those terms. And that leads to overreach, to hubris, to the sin of pride, which Niebuhr saw as the root of all of our sinfulness.
I also look back at the patristic fathers in the Eastern Orthodox tradition. And you know, while it says in Romans that the wages of sin is death, in Hebrews it essentially says we sin because we fear death. It’s not that sin causes death but that our fear of death causes us to sin. And that’s also what Niebuhr was getting at. So when we forget that we are finite, limited creatures, and that every life has a term, we tend to grasp and try, through our own mechanisms, to avoid death, which we ultimately cannot do.
Looking at your work, you’re not anti-technology. At least, I don’t think you’re saying we should take a sledgehammer to our Roombas or throw our phones in the nearest body of water. So what are some tangible ways that we can maybe think about engaging with artificial intelligence in healthy ways?
Firstly, let’s not make category errors. Humans are humans. Living beings are living beings; AI is not. It’s still a machine. We should give to the machine what belongs to the machine, give to humans what belongs to humans, and give to God what belongs to God. We should turn the machines off, be in relationship with each other, and give humans what we owe one another: Look the other in the eye, speak to and hear the other, and do it gladly, not all mediated by machines. Because every time it’s mediated, it becomes partial. So technology has its place, but it should never be a substitute.