Posted by: Brent Ramsey

Today, telling the truth has largely disappeared from public discourse. Let's use basic history as an example. Learning history in school was the norm during the period 1952-1969, my formative years up through graduation from college. I received a detailed education in public schools on our founding principles, American history, and world history dating back to antiquity. As part of a Navy family, I went to school all over the country: Virginia, Maryland twice, California, Illinois, and Nebraska. History teaching was consistent everywhere, including hard truths about slavery here and around the world. After college, I continued studying history in depth, including the Revolutionary period up to our founding, the Civil War period and Lincoln, and the major world wars of the 20th century, especially WWII.


Studying the Founders was particularly insightful and important for really understanding the true nature of America. The Founders were highly educated and acutely aware of man's checkered history. They succeeded, under extraordinarily challenging circumstances, in establishing a new nation based on principles of liberty and opportunity. Was our Founding flawed by slavery and other holdovers from a more primitive era? It was. But abolishing slavery at that time was impossible given the South's dependence on it as an essential element of its economy. So, evil as it was, had it not been accommodated, America would not have been created at all, and we would likely still be English.
