Is Christianity a Western religion? That may not strike us as a significant question; it may not register as something necessary to establish as true or false. But there are two reasons why it is important to dispel the illusion that Christianity is a “Western” religion.
- Some enemies of Christ will claim that Christianity is a Western religion, a philosophical development of the Western world that belongs only to the Western world, and therefore is not meant for all men. Some even go so far as to say that Western religion is actually inferior to Eastern spirituality because of, among other things, its “closed-minded” insistence on monotheism, its reliance on certainty of doctrine, and its history of cooperation with colonialism. In this way many people who either belong to or fall in love with Eastern cultures have clung to the excuse that “Western religion” is not for them, and therefore Christianity is not for them.
- Some Christians will claim that Christianity is a Western religion, and therefore that the West is superior to the East, that the West should rule the East, and/or that in general the West is inherently good and the East is inherently evil. Some also assume that whatever is Western is therefore Christian, and blend the values of their Western surroundings into their own concept of Christianity. In this way many people assume that Western imperialism, colonialism, capitalism, and/or militarism are not only necessary to further the Gospel, but are even “pillars” of the Gospel.
You can see why many people believe this at first. For the past 200-300 years, it has been white missionaries from Europe and America who have gone to the East to bring the Gospel. For the past 200 years, America has grown to celebrate itself as “one nation under God”, with many of its citizens claiming some sort of special privileged status under God. And for the past 300 years the Eastern hemisphere has been greatly harmed by Western colonialism, which was often accompanied by missionary work. But this is only a superficial reading of Christianity and its role in the world, a view that is increasingly fading in an age of post-colonialism, as once-colonized nations shed their colonial ties and we wait to see what happens to the Christian “outposts” that remain. It is arguable that Christianity is now stronger worldwide than it was during the heavy days of Western colonialism. In the upcoming posts, planned to appear every two days, we will explore why the label “Western religion” misrepresents the Way of Christ, and why that matters.