In ominous red-on-black lettering, a recent Newsweek cover carried the headline, “The Decline and Fall of Christian America.” The magazine’s cover story, by editor Jon Meacham, described what he saw as a failed project by the Right “to engineer a return to what it believed was a Christian America of yore.”
The story immediately provoked a wide array of reactions from across the spectrum. Whether Meacham is ultimately correct in his observation of these trends and his interpretation of their meaning remains to be seen. The 1966 Time magazine cover that asked “Is God Dead?” could not have foreseen the development of religion in American public life over the following 40 years, and we shouldn’t expect any more prescience from Newsweek. What the Newsweek cover has accomplished is to raise questions vital both to the health of the Christian tradition and to the public discourse of our nation.
The question that struck me most in the story was the changing role of religion in public life and politics.
The Religious Right was a Christian mistake. It was a movement that sought to implement a “Christian agenda” by tying the faithful to one political option—the right wing of the Republican Party. The politicizing of faith in such a partisan way is always a theological mistake. But the rapid decline of the Religious Right now offers us a new opportunity to rethink the role of faith in American public life.
I AM NOT OFFENDED or alarmed by the notion of a post-Christian America. Christianity was originally, and in my view was always meant to be, a minority faith with a countercultural stance, rather than the dominant cultural and political force. Notions of a “Christian America,” quite frankly, haven’t turned out very well.