“Was America Really Founded As a ‘Christian Nation’?” Most Americans believe the U.S. was founded as a Christian nation. But is this merely a myth? Larry talks with the author of a new book who argues that ‘Corporate America’ invented ‘Christian America,’ and explores how that idea has defined and divided our politics ever since.
KCRW’s To the Point with Warren Olney
“One Nation under God…but Since When?”