Can Christianity Make a Comeback?

America may never have been a “Christian nation,” but this much is certain: for most of this country’s history, Christianity has been a dominant cultural and social force. From driving social service outreaches to founding educational institutions, hospitals and more, the Christian faith has made an indelible mark on our society—at least until recently.