Can Christianity Make a Comeback?

America may never have been a “Christian nation,” but this much is certain: For most of this country’s history, Christianity has been a dominant cultural and social force. From driving social service outreach to founding educational institutions, hospitals and more, the Christian faith has made an indelible mark on our society—at least until recently.
