Despite what our friends on the Left would lead you to believe, America is far from being a racist nation. One often hears the argument from leftists that America is inherently racist, even if most racists don’t realize they’re racist, but in reality the claim couldn’t be further from the truth. Throughout the nation’s history, there have been many significant events that undermine the claim that the U.S. is systemically designed to ruin the lives of minorities.
Let’s start with the most obvious example, the Civil War, which was fought to end the immoral practice of slavery in the United States. By the end of the war in 1865, the U.S. had lost 620,000 men to free the slaves. These were mostly white men fighting to give African Americans the freedom they deserved. It ultimately didn’t matter that the South cited states’ rights and the Constitution to justify its slave-based economy. What did matter, however, was that the Abolition Movement convinced many people of the personhood of African slaves, leading many American citizens to join the fight for their freedom. Even after the war was fought and won, leaders of the time successfully pushed for the 13th, 14th, and 15th Amendments to empower freed slaves. These amendments, respectively, ended slavery, granted freed slaves American citizenship, and recognized their right to vote.
Sounds odd for an allegedly racist nation.
Though leftists may point to segregation and Jim Crow laws as proof of America’s continued racism, one shouldn’t forget that these policies were also terminated, as they should have been. Since Brown v. Board of Education (1954), it has been illegal to segregate students by race. Furthermore, the efforts of Martin Luther King Jr. and other heroes led to the Civil Rights Act of 1964 and the Voting Rights Act of 1965, ending legal discrimination and the Jim Crow laws. The Warren Court set many more precedents in favor of oppressed minorities.
Pretty good for a nation that supposedly detests minorities.
Admittedly, anyone who studies American history will tell you that it hasn’t always been a fair nation for minorities. That is indisputable. However, left-wing ideologues and historians conveniently fail to acknowledge that although the U.S. was a nation of slavery, Jim Crow, and discrimination, it is also a nation that has made huge strides in eliminating these evils. I believe that a truly racist nation would never even consider abolishing its racist laws. But still, leftists argue that “systemic oppression” holds minorities down, and they point to statistics that allegedly display racial inequities. What they leave out is the fact that there is no law or policy in place that coerces minorities into poverty, drives them to commit a higher percentage of crimes, or prevents them from attaining a college degree. Virtually no one denies that helping minorities is a good thing. But we should also agree that labeling America a racist country today does nothing but harm the fabric of society and belittle the amazing results America has achieved in making itself the greatest country to live in, no matter what your skin color is.