Christian Worldview Has Changed

A few weeks ago I ran across a study conducted by The Barna Group regarding the changes in worldview among professing Christians in the United States.  I was blown away by the results.  I knew that the watering down of our Christian faith had had a significant effect, but I had no idea it was this diluted.

We are becoming a people who have no understanding of our faith.  We have forsaken the truth, and each generation retains less and less of it.  In our postmodern culture, where truth is relative and defined by the individual, our churches are becoming irrelevant.

Read The Barna Group’s report and tell me what you think.  Am I reading more into this than I should?  What do you think?

2 thoughts on “Christian Worldview Has Changed”

  1. I recommend reading the book unChristian. It goes into more detail about the problem of Christianity in our country and how it has become distorted. It really opens your eyes.
