Recently, a co-worker gave me a set of religious CDs on the topic of sin and morals. She is trying to convert me, and while that's never going to happen until she can prove there is a god, it gave me an excellent opportunity to check out the materials that Christians think are important in the discussion of religion. After listening to these CDs, I walked away with one truly overpowering observation:
Every shred of Christian morality is essentially a Humanist concept wrapped in the shroud of religion.
Why can't the religious among us see this? Does any rational person think there was no morality before Jesus? We know that the Romans, Egyptians, Chinese, and Aztecs all had written laws and codes of conduct independent of Jesus (or of the Old Testament, in the cases of the Egyptians and Chinese), so why do Christians think they have the right to declare that their religion is the basis of all morality in the world?
It seems like an obvious point that morality has changed over the last two thousand years, yet the Bible hasn't been updated in that time to reflect those changes. What has changed is our deepening understanding of the world we live in and a much greater ability to work with and appreciate those who are different from ourselves. These changes, of course, are rooted entirely in humanist concepts and exist fully separate from religious dogma. In fact, we could argue that the proponents of religious dogma have stood in the way of advancements in humanism, and consequently of improvements in overall morality.
As an example, look at the current struggles between women and the Catholic Church. The Catholic Church has a long history of enforcing policies specifically aimed at keeping women in a lesser role than men (the fact that in 2012, women are not allowed to be priests, and therefore not allowed to be a part of any decision-making, is stunning). The new push against birth control (while still standing firm against abortion and sex ed) is further proof that in the Vatican's view, women should not be allowed to govern their own reproductive health and should exist solely as wives and mothers. As Americans, we know that this position is not a moral one (regardless of your pro-life vs. pro-choice stance) because we have come to accept women as equals over the past few hundred years. As Susan B. Anthony was fully aware, women's equality happened because of humanism and not because of religious dogma (in fact, religious dogma was, and is, the primary hindrance to women's rights).
My question is simple: what is it going to take to get people to finally see this preposterously obvious point?