I'm arguing with someone about societal change 101, and I really don't get this whole "you shouldn't be mean to oppressors, change will come by itself eventually" attitude, because honestly, when has that ever been the case? WWII wasn't won by the Allies sending Hitler a nice but firm letter. The LGBTQ+ rights movement didn't really kick off until the Stonewall riots. Black people weren't taken seriously in the U.S. until people like Malcolm X and Martin Luther King Jr. stood up for themselves and pushed for change.