Church Anew


ChatGPT as a Preaching Resource: Everything and Nothing All At Once

Photo by Growtika on Unsplash


As a tool for generating ideas, ChatGPT and other AI tools offer us the world. Seemingly trained on the entirety of the internet, AI-generated content presents as authoritative and confident. All of the major AI platforms can generate a compelling sermon manuscript, plan a Sunday School class, or write a church newsletter. Yet as a tool for integrating ideas within a broader body of knowledge, AI offers very little. Lacking the ability to cite sources or contextualize ideas, AI-generated content is error-prone and unreliable. ChatGPT, for example, is notably inaccurate at basic arithmetic, and has been prone to inventing historical facts, locating cities in the wrong country, and providing dubious medical advice. AI-generated content, then, is somewhat force-fed to us, without any incentive to contextualize or criticize. Author E.M. Forster unknowingly foreshadowed the impact of artificial intelligence on our intellectual capabilities when he remarked, “Spoon-feeding in the long run teaches us nothing but the shape of the spoon.” This is the paradox of artificial intelligence: that it is simultaneously authoritative and fabricated, compelling and delusional, everything and nothing all at once.

The preacher has a moral responsibility to wade through these contradictions in learning how to use these emerging technologies. We are rapidly heading toward a culture where artificial intelligence will be a normative aspect of thinking, learning, and writing. As tools like ChatGPT and Google Gemini permeate knowledge creation and sharing, they will also infiltrate the proclamation of God’s Word. Artificial intelligence is already changing the process of sermon preparation. With tools like ChatGPT, crafting a sermon has become more efficient and accessible. Through the simple input of a passage from scripture, AI can generate a compelling sermon with the potential to engage and inspire listeners. This technology has become a valuable resource for those who may lack the time or expertise to develop sermons from scratch, offering a helping hand to the resource-constrained pastor or parishioner. When provided additional context and details about an intended audience, artificial intelligence can tailor a sermon to resonate more deeply with the needs of a specific ministry context. Given the theological nuances and preferences of the listeners, AI can even craft a sermon that not only conveys the message effectively but also aligns with the beliefs and values of the audience.

And therein lies the dilemma. How do we account for a tool that seems so effective at accomplishing creative tasks, yet so opaque in its process? How do we trust a tool whose “thinking” is remarkably cogent, yet whose thoughts are so unknowable? As today’s preachers consider how to use these resources to preach the Word, they need to think critically about how AI generates its content. The church leader may use AI as a co-pilot in the creative process, but only as a co-pilot treated with an intense degree of skepticism. Ultimately, it is necessary to view AI-generated ideas as the starting point for inquiry, not as the finished product for a sermon or any other faith-related content.

One might begin to think through this question by reflecting on what AI models have in their toolkit, particularly when prompted to address matters of the church or the life of faith. Large language models like ChatGPT generate media outputs through an impressive process of prediction. By scouring the internet, the tool assigns myriad probabilities that one word will follow another in addressing a specific query. This process of statistical inference is behind all AI-based chatbots (for an approachable explanation of this technology, I recommend Cal Newport’s recent article on the mind behind ChatGPT). The internet is full of free, text-based faith resources. By extension, AI has already been trained on the scriptures, the works of major theologians, ancient doctrine and contemporary hermeneutics. But it has also been trained on content that is generated on social media and digital video sites, the works of lay influencers and celebrity pastors, church marketers and digital evangelists. It has been influenced by the writings of Christian reformers and Christian nationalists, the speeches of denominational leaders and fringe thinkers. When the preacher prompts AI for assistance in the sermon creation process, it’s impossible to recognize whether it is drawing upon Thomas Moore or Twitter, Ignatius or Instagram, core doctrines of the church or the fleeting whimsies of a would-be influencer. 
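The statistical prediction described above can be made concrete with a toy sketch. The snippet below is purely illustrative (the corpus, function names, and scale are invented for this example): it builds a simple bigram model that counts which word tends to follow which, then predicts the likeliest continuation. Real large language models use neural networks conditioned on far longer contexts, but the underlying idea of predicting the next word from observed frequencies is the same.

```python
from collections import Counter, defaultdict

# A tiny stand-in for the web-scale text an LLM is trained on.
corpus = (
    "in the beginning was the word and the word was with god "
    "and the word was god"
).split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the most frequent next word and its estimated probability."""
    counts = following[word]
    total = sum(counts.values())
    best, n = counts.most_common(1)[0]
    return best, n / total

print(predict_next("the"))  # "word" follows "the" in 3 of 4 occurrences
```

The model knows nothing about meaning or truth; it only reflects the frequencies of its training text, which is exactly why the provenance of that text matters.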

Yet by the start of the next decade, tools like ChatGPT (or a similar resource) will be our most widely used preaching resource. These tools have the potential to be more widely used than our lectionary resources, commentaries, and volumes of theology. They will be more widely used than today’s Google search engine. As congregations strain under limited staff and financial resources, time-crunched pastors may be incentivized to outsource some of their sermon writing to technology. These tools will be too fast and too easy to ignore.

This creates a certain urgency in today’s moment of technological development. Yes, we should be learning to use AI in the sermon creation process. But unless we learn to reflect upon, refine, and scrutinize the output of these resources, our proclamation will be unwittingly influenced by an invisible cloud of unknowable voices. Our tradition deserves better. So use AI to develop your sermon ideas. But resist the urge to cut and paste. For there are voices at work within the algorithm with vastly different religious agendas and theological imaginations. It is up to us to debate whatever AI generates, rather than to act as a doormat. It is up to us to contrast AI-generated content against the canon of our tradition. It is up to us to be faithful and prayerful and judicious in using AI as an assistant, but an assistant we treat with vigilance and skepticism.

