This is kinda silly, but I am interested as a matter of research for my novel.
Think back on the history of the church and you see many developments that forced Christians to question how they understood the world and their faith. For some this meant walking away from what they believed; for others it meant revising their beliefs; for others it meant ignoring obvious truths; and for another group it meant examining where their interpretation of Scripture, their traditions, and their presuppositions had led them to wrongly associate mistaken ideas with Christianity.
When the heliocentricity of the solar system became pretty much undeniable, many people struggled with how this fit into their faith. They had always understood the earth to be the center of the solar system, and this supported their belief that the earth was the center of God's creation, along with the language of the Bible, which spoke of the rising and setting of the sun. If the earth were not the center of the solar system, then they had to revisit these areas of their beliefs.
So what would a zombie apocalypse do to your faith? What would you have to ask yourself about God, the Bible, and your beliefs? Which of the groups I described above would you fall into?
Obviously this is just for fun. Feel free to share your thoughts, or to tell me this is sacrilegious. It won't hurt my feelings.