Is Christianity a Western religion? How has the West treated Christianity?
Christianity was terribly persecuted in the West before the West accepted it. And even then, the West accepted, and would continue to accept, Christianity only partially and on its own terms.

That may not strike us as a significant question; it may not register as something we need to establish as true or false. But there are two reasons why it is important to dispel the illusion that Christianity is a “Western” religion.