Religious Right groups and their allies insist that the United States was designed to be officially Christian and that our laws should enforce the doctrines of (their version of) Christianity.
Is this viewpoint accurate?
Is there anything in the Constitution that gives special treatment or preference to Christianity?
Did the founders of our government believe this or intend to create a government that gave special recognition to Christianity?
The religious crackpots will try anything to justify their deluded position.
Just look at 1954, when Congress added "under God" to the Pledge of Allegiance, effectively turning it into a prayer.