US FASCISM
Let's be clear. Fascism hasn't come to America. Fascism WAS BORN in America. American fascism inspired Hitler, enforced racial segregation, and has never gone away. America exists as it does today because of a race-based fascist genocide and land grab. Fascism is as American as it gets.



