Is Christianity a Western religion? After all, isn’t the West the “cradle” of Christianity? Isn’t America, the premier child of Western culture, the “cradle” of Christianity?
The notion that America is the “cradle” of Christianity is a very recent development in history, and it may soon no longer be true. Whether America is even the current “capital” of Christianity is itself a largely subjective question.
That may not strike us as a significant question; it may not register as something we need to establish as true or false. But there are two reasons why it is important to dispel the illusion that Christianity is a “Western” religion.