A couple of weeks ago, I wrote about what I think “secular” means. Today, I want to explore the idea that our country is now “post-Christian.” The label gets thrown around a lot, but what does it actually mean? I think much of the confusion about our current cultural moment comes down to…
What Does “Post-Christian” Mean Anyways?