Western Values
Some of those who have been dubbed Cultural Christians (sometimes even by themselves) make the case that we should not dismiss Christianity, because if we do, we also dismiss all of the values it has given us. You might not believe the vivid stories of the Old Testament, or even that Jesus was resurrected, but you should still believe in the Christian values of love and forgiveness, and all the other ethics of the Bible. And in a way it seems logical: why reinvent the wheel if we already have a value set that works?

But ask one of these Cultural Christians whether we should not then also look for value in Hinduism, Buddhism, or Zoroastrianism. A brave few would reply that Christianity is superior, as it is the foundation of the West: that our Western values, rules, and culture are all based on Christian values. Well, except for maybe democracy, evolutionary science, the right to abortion, and less important stuff like that. Capitalism doesn't strike me as very Christian either, and Jesus really didn't seem to like the idea of free divorce. "But Christianity is the compass that guides you; without it, you would be lost!" So we should follow a compass that we don't actually believe works? Also, if things like freedom of speech, women's rights, etc., are indeed Christian, why did it take ~2000 years for the Christians to realize it?
Why assume that Western values are a consistent, homogeneous thing, and that they are fundamentally Christian, instead of assuming that Western and Christian values are a symbiosis, that the ideas and thinking of people have shaped both? That rationality has shaped how we interpret the Bible, and that ideas from the Bible have inspired our reasoning?
---
Thought of the day:
If Western values are derived from Christianity, is wokeness then fundamentally Christian? Would Jesus have been woke?