Is Christianity a Western Religion? 3: Isn’t the West the “Cradle”?

Is Christianity a Western religion? After all, isn’t the West the “cradle” of Christianity? Isn’t America, the premier child of Western culture, the “cradle” of Christianity?

The notion that America is the “cradle” of Christianity is quite recent in history, and it may soon no longer be true. Moreover, whether America is the current “capital” of Christianity is a largely subjective question.