The American nation was founded by people who were deeply Christian. True, plenty of evil people have used the nation over the years for un-Christian activity, but the base was Christian. You can find Christian sayings and symbols all throughout Washington, D.C.
The point is that Christianity shows up in the books and school teachings simply because that's where the nation came from; that is how it was settled originally.
It was settled as a Christian society by England before its independence. The U.S. inherited that Christian foundation, even though many of the architects of the Declaration of Independence and the Bill of Rights (15 years later) were religious in their public lives but non-practicing in their private affairs. (For example, our first elected* President, George Washington, was a dues-paying member of over a dozen churches, even a board member of several, yet rarely attended a Sunday service while home at Mt. Vernon, Virginia. On public trips, though, he made a show of attending church services several times a week.)
*President of the Continental Congress was a largely ceremonial position, so virtually all historians date the first presidency to George Washington in 1789.