It doesn’t seem like this was always the case - obviously there’s a lot of myth-making about the “Founding Fathers”, but it does seem that a lot of them were genuine Enlightenment men.
I’m not under any illusions that the USA was ever a secular nation, but the phenomenon we see now, of right-wingers equating America with Christianity and Christianity with America in their worldview, doesn’t seem to have always been there.
Is it just the result of Cold War propaganda, juxtaposing the American empire of Christendom with the evil atheist Soviets?
Insane settler-colonial millenarian evangelical sects are as American as the genocides of indigenous peoples.