A recent string of Leftist tirades has lumped ‘white conservative Christians’ in with white supremacists, claiming that white Christians, especially fundamentalist and orthodox Christian groups, are racist. These Left-liberal ranters also blame growing gun violence, terrorist attacks, and every other problem facing the Nation on conservative Christians who hold firm to Biblical teachings. This is clearly an absurd exercise in blame shifting. Historically, fear and unrest in nations have been linked to individuals straying from Biblical teachings such as the Ten Commandments. This raises the question: should America remain a Christian Nation?
Is America’s fate tied to following the Christian God and His law? Let us know what you think, and share this with everyone so we can get the truth out and stop the ignorant rants dividing this country!