Has America gone pagan?
Thank Gaia, yes!
Yes, but it's no more or less dangerous than any other religion
Yes, America is post-Christian, therefore by definition pagan
Let's see: Child sacrifice? Check. Rampant sexual perversion? Check. Relativism, irrationalism and superstition preferred over objective truth? Check. I'd say yes
Yes, it went pagan when the church lost her influence several decades ago
No, if it were a big thing, I would know someone who's into Wicca. And I don't
No, this whole witchcraft thing is a fad, a trend that will be replaced by something else
No, Christians quit being salt and light and left America up for grabs
No, America is still the most Christian nation in the world
No, believers in God are still the vast majority