Earlier this week, I wrote a blog post about ChatGPT. The main goal of the post was to show some legitimate ways to use ChatGPT in church while addressing what I believe is its main danger: it could write your entire sermon for you.
Surprisingly, the biggest pushback I received wasn't from the Luddites saying, "I don't want to use ChatGPT because it's dangerous." It came from people who took umbrage at the idea that ChatGPT shouldn't be used for sermon prep.
Before I delve further into this topic, I want to clarify my stance:
I overstated my point about not using ChatGPT for sermon prep.
I am not against using ChatGPT for sermon prep. I just believe there is an inherent danger in it that doesn't exist with traditional tools.
So let me take a mulligan here and provide a more thought-out and nuanced philosophy on using ChatGPT for sermon prep.
One more caveat: This is just one guy's opinion. I am no expert. I happen to be a super technical person who also writes and delivers three sermons every week. I was an early AI adopter and have been using AI to assist with my programming for years. I'm involved in both preaching and training preachers, so I've thought a lot about this. However, what follows are just my opinions, which you can take or leave.
General Guidelines
Plagiarism is still plagiarism, even if you are plagiarizing a computer.
I think the biggest danger in using ChatGPT for sermon preparation is that ChatGPT is designed to be plagiarized. You can give it a prompt like:
Write me a 3000-word sermon from an evangelical Baptist perspective on Ruth chapter 1.
And in thirty seconds, you'll have something decent and unique.
This is amazing! It's also really good. Notice in that screenshot:
- The logical breakdown of the text.
- The biblical reference to the time period.
- The reference to Bethlehem being "the house of bread."
- The reference to the irony of Elimelech's name.
- The application of compromise.
The rest is just as good.
Were I to write a sermon on Ruth 1 (and I have several times), I would do everything ChatGPT is doing here. The difference is that I would have gleaned the outline from reading the text repeatedly and the rest from studying language tools and reading commentaries. But in seconds, with no work, I got a passable sermon.
If I were feeling particularly lazy, I could slightly edit this and preach it to my church, and no one would know.
Further, no one could go online and find that I ripped it off from John MacArthur or Alistair Begg because it was uniquely generated for me.
It is the ultimate cheat code, which is what makes it so dangerous.
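For the technically inclined, the cheat code really is that short. Here is a minimal sketch of what generating a sermon programmatically looks like, assuming the official openai Python library, an API key in the OPENAI_API_KEY environment variable, and a placeholder model name. It is an illustration of how low the barrier is, not a recommendation:

```python
# A minimal sketch of how little effort a machine-written sermon takes.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any capable chat model will do
    messages=[
        {
            "role": "user",
            "content": (
                "Write me a 3000-word sermon from an evangelical "
                "Baptist perspective on Ruth chapter 1."
            ),
        }
    ],
)

# Seconds later: a unique, passable sermon with no author to credit.
print(response.choices[0].message.content)
```

That is the entire "process": one prompt, one call, no study.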
I am against plagiarism in sermon writing. I think it is fundamentally dishonest to pass off someone else's work as my own (even if that person is employed by me for that purpose).
Were you to read my personal notes (not my manuscript, but my study notes), you would find that every time I preach, I put my sources at the top of the page. If I use something from someone else in my message, I give them credit, even if my congregation has no idea who they are.
How do you give proper credit when substantial parts of your message were generated by a nameless, faceless, soulless computer? You cannot and do not have to, and that scares me to death.
So I would say that my first rule for using AI in sermon prep is this:
Do not copy from Artificial Intelligence in a way that it would be wrong to copy from another writer or preacher.
The process matters as much as the destination.
My favorite book on preaching is Haddon Robinson's Biblical Preaching. One of the points the book makes is that people aren't really listening to a sermon; they are listening to a preacher. The sermon has to affect you before it can affect your audience.
If that is true, then the process matters.
I could, if I wanted to, go online and find any number of excellent sermons on any text in Scripture. Indeed, I have a library full of them. So why do I spend most of my week reading, thinking, and writing? Why not just give people what some other (probably more eloquent) preacher said? Why reinvent the wheel every week? Because doing so would shortcut the process.
I smoke a lot of meat. Tonight, I will likely put a pork butt on the smoker for a fourteen-hour, low and slow cook. The goal is not to get that pork to 205 degrees as quickly as possible. There are many ways I could do that. The magic is in letting the process do its work on the pork over time. The process matters.
ChatGPT is like a magical microwave that can produce whole sermons in seconds. I don't even have to pore over books and online sermons; it can write my whole sermon for me. If I don't like what it writes, I can ask it to do it again and again until I find something I do like. But it shortcuts the process, and the process is where the magic happens.
Wouldn't my congregation feel cheated knowing that I microwaved something instead of letting the Scripture work on me all week? Doesn't that cheat my own sanctification?
So my second rule for using AI in sermon preparation is:
Do not use Artificial Intelligence in sermon preparation in a way that shortcuts the process of study and meditation on God's word that people expect from a faithful preacher.
Fact-check: true
I remember hearing a preacher once tell a story about a man trapped in a train car who froze to death even though the train car was 55 degrees. The story was told very well and was a perfect illustration of the speaker's point: that what we think about and worry about has a profound effect on us. The only problem is that the story absolutely never happened.
The internet has completely ruined a lot of "good" illustrations. People in the pew have smartphones and can check the veracity of everything we say. If we exaggerate and pass on as true some fact that isn't right, what does that say about the Scripture we are teaching and applying? If we are gullible enough to believe in things that obviously didn't happen, what does that say about our belief in Scripture?
When we preach (or when we prepare to preach), we need to constantly ask the questions, "Is this true?" and "Can I verify this?" and cut out the things that aren't.
As I have used AI for programming, one of the things I've noticed is that sometimes it just makes stuff up. There have been many times it has suggested a function or method to me that does not exist.
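To make that concrete, here is a made-up Python example of the kind of hallucination I mean; the suggested method is intentionally one that does not exist, because that is exactly what the AI sometimes hands you:

```python
# A hypothetical example of an AI-hallucinated suggestion.
# Python lists have no .shuffle() method (shuffling lives in the
# random module), so this kind of code fails the moment you run it.
import random

verses = ["Ruth 1:1", "Ruth 1:6", "Ruth 1:16"]

try:
    verses.shuffle()  # hallucinated method: raises AttributeError
except AttributeError as err:
    print(f"The code tells you it doesn't work: {err}")

# What a correct suggestion would look like:
random.shuffle(verses)
print(verses)
```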
In programming, that's pretty apparent right away. You try the code. It tells you it doesn't work. You go back to the drawing board. But what about in preaching?
What if ChatGPT or some other AI tool makes up a story that didn't happen or shares a Bible reference that doesn't exist? Are you, as a preacher, going to pass that on to your people? Will you even know?
So a third rule I have for using AI in sermon prep is this:
Never use something from Artificial Intelligence that you have not verified.
Profitable Uses
So where does that leave us? Can we still find profitable uses of ChatGPT and other AI tools that don't amount to plagiarism, shortcutting the process, or sharing things that are questionable? I think you can.
With those guidelines in mind, here are just some of the ways to use ChatGPT in sermon preparation that I do not think are cheating:
Use AI as a thesaurus or dictionary
The way I use AI in my sermon writing more than anything is as a thesaurus or dictionary.
If you were to search my prompt history, you would find a lot of instances of:
- What are some words that mean...
- What is a word that starts with P that means...
- What are some other ways to say this sentence...
This is just a much more efficient thesaurus or phrasebook, and the answers can be easily verified using a traditional dictionary.
Use AI to look up specific facts
Another way I've used AI in my sermon writing is to look up specific facts. These might be about the history or background of the text, some statistic I am looking for, or some other supporting material I am interested in for illustration.
For instance, the other day I was writing a sermon about Tychicus (from Colossians 4), and I wanted to know how long the journey would have taken from Rome to Colossae around AD 60. So I asked ChatGPT and got a very thorough answer. That kind of answer is pretty easy to verify.
Basically, you can use ChatGPT in cases where you would be doing a Google search that would take you to Wikipedia.
Here are some examples:
- How many people live in modern Turkey?
- What percentage of a country's population are fighting-age males?
- How many casualties were there in the American Civil War?
- How would people in 1st century Israel harvest grain, and what kind of grain would they harvest?
Again, you have to verify these answers, but asking ChatGPT often points you in the right direction and saves you a lot of work.
Use AI as a cross-reference tool
You could ask ChatGPT for a list of verses from Proverbs about the folly of getting angry too quickly or a list of verses on anger. This wouldn't be much different from doing a word search on Blue Letter Bible or a topic search on GotQuestions.org. You could easily verify each reference with a Bible.
I've found that, for my purposes, this isn't as helpful as just using my favorite reference tool: The Treasury of Scripture Knowledge. But I don't think it is cheating.
Conclusion
I live in Amish country. I cannot go to the local Walmart or Aldi without bumping into Amish families. There are a lot of misconceptions about Amish people and their relationship with technology. They are not actually against technology and innovation; they are for thinking carefully about the unintended effects of that innovation on what they hold most dear: their faith, their family, and their community.
New technologies have a way of destroying as much as they create, and it behooves us to think about them carefully. Artificial Intelligence will prove to be one of the most disruptive technologies ever invented. As preachers, we have to think carefully about how we use it and how we don't. We have to recognize its tremendous utility, but we also have to recognize its dangers, and we need to set clear ethical guidelines.
Consider this post my humble attempt to start that conversation.