Mar 28, 2020

The Declining Influence of Christianity in the 19th-Century West

1 Min Read

While many Western people remained committed Christians in the 19th century, many of the most influential individuals did not. From his teaching series A Survey of Church History, W. Robert Godfrey examines the impact of this religious decline on our own culture today.


Already at the beginning of the 19th century, there was a sense that Christianity’s future was problematic. In 1799, a still relatively young German pastor, Friedrich Schleiermacher, gave a series of lectures that were later published under the title “On Religion: Speeches to Its Cultured Despisers.” It is striking that a young preacher would conclude that Christianity was so under attack by the intellectual elites in Germany, and more broadly in Europe, that he decided to address them forthrightly with that series of speeches. That points to an important pattern we see as we move through the 19th and 20th centuries: Christianity would remain strong and influential in many places among what we might call more “common people,” but with declining influence among the “power brokers,” both intellectual and political.

We don’t have to stretch our imaginations very far to understand that, because we see it in our world today, don’t we? Vast sections of America are still religious, still churchgoing, still pious, and still Bible-believing. But the cultural elites in America, those who control the universities and the public media, are by and large neither interested in that faith nor convinced of its importance.