No, this isn't a troll or anything; it's the topic of a seminar I'm doing for History class. Basically I just want to offer this up for discussion: do you think it's true? I know this could get rowdy, so please keep it polite.
What this is getting at is that fewer people now believe in God, especially the Christian God, than did before, and that the church has much less power than it once did. Of course it's hard to define 'the West', but I think of it in terms of modernization: the rise of rationality, industrialization, capitalism, individualism, deruralization (if that's a word), etc.
So do you agree that religion is less important to society than it was 20, 100, 200, or 500 years ago? Why might that be?
And if you know of any films or other material that would be good to show, feel free to suggest them.