Thy kingdom gone

Why is Christianity declining in the West?

- Not sure, but it's not declining fast enough for me
- Science and reason are triumphing over ignorance and superstition
- Its homophobia and hate can no longer be tolerated
- Multiculturalism -- the influx of immigrants of different religions has shown Christianity isn't the only way to live
- Not to be too Darwinian, but Western Christians' small families put them at a competitive disadvantage with Muslims
- The secularists have been militant and successful at forcing expressions of Christian faith out of the public square
- Christians are so seduced by philosophical materialism that, despite their professions of faith, they live as practical atheists
- Most churches have so watered down the biblical message that "Christianity" is indistinguishable from the dominant culture
- Christians have underestimated the power of sin and their duty to shun, resist and overcome it
- Christians fell for the culture's focus on self-fulfillment and reduced the Gospel to a program of individual self-improvement
- The breakdown of the family and its replacement with the welfare state
- In their pursuit of personal peace and affluence, Western Christians forgot God -- now He is forgetting them
- God is waiting for Christians to humble themselves, pray, turn away from wickedness and turn toward Him before He will heal the West
- Other