What I find fascinating is the deep belief in Christianity throughout the African-American community. Of all groups of people, you would think Blacks would be among the most anti-Christian. The way Whites brutally treated Blacks for so long makes me wonder why Blacks embrace a religion long considered primarily a White one. Whites shoved Christianity down the throats of just about every group of people they came in contact with around the world, using Christ as a justification for imperialism and the domination of other races.

You would think Blacks, like many other groups, would be resistant to Christianity. I know some groups were, especially Native Americans. Many Natives became Christians, but many also clung to their own beliefs. What's striking is that Native American beliefs arguably made more sense than Christian ones. It's understandable to at least pretend to believe in something for survival and political reasons. Unfortunately, it appears that the indoctrination was handed down from generation to generation.

It's too bad our history books are afraid to point out that Christianity was used as a tool for European brutality. If kids were taught properly about the history of racism, their understanding might lead to less hatred in society and more empathy.