It’s come to my attention that people have forgotten how VITAL America is in our world. People have become so consumed by ‘progressive’, culture-based ideology that they’ve forgotten why America is the best country on THIS planet.
People used to be proud of what America stood for; now they resent and mock our beloved country. Of course, you can hear the heckling of the alt-left and cower to it, or you can turn it around and say that it simply is not true.
Take, for instance, the 1619 Project, which claims America ACTUALLY began when slavery first began in the New World. Not only is this blatantly false, but its goal is also to change how we see America. They want us to see our country as an evil and vile system built to suppress its people.
The second you begin to focus more on the flaws America has than on the incredible accomplishments we have achieved is the moment America will no longer exist.
We are the country that beat Hitler. We put mankind on the freaking moon. We helped establish the highest ethical standards for how humans are meant to be treated. We have revolutionized the idea of democracy and citizens’ rights.
So before you listen to talk about how awful we are and how our past calls for us to completely uproot our present, think about it. If our past is really that evil, how have we made such big strides?