For decades now, Christian thinkers have been describing Europe and North America as “post-Christian.” It was accurate for a while, but the time has come to retire the term.
A word like “post-Christian” was never destined to last long in the first place. Great cultures aren’t known by what they used to be, but by what they are. (Who calls medieval Europe post-pagan?) It made sense, for a few decades, to describe Western culture in terms of the Christianity it was leaving behind. But now a new faith has swept the old one totally aside.
It is a faith of new gods — millions of them. It’s polytheistic, in a sense, except where the Bible (Romans 1:23) speaks of the people making idols in their own image, this religion goes a step further: we ourselves are the gods.
Acting Like Gods
What else besides a claim of godhood is going on, after all, when a man declares himself a woman and insists that his new sex (“gender”) is reality, not only for himself but for everyone else? What else explains his demand that everyone kneel in obedience to the new reality he has created?
“When men choose not to believe in God, they do not thereafter believe in nothing, they then become capable of believing in anything.” (attributed to G.K. Chesterton)
How else do we make sense of doctors and judges claiming authority to decide whose life is worth living? These are just two examples among many in which men and women claim the rights, powers, and privileges that only a god can have.
This article continues at [Stream.org] Post-Christian No More: The Western World Has Its Own Gods Now