Seriously. I've been wandering around my town and everywhere I go there's fucking "Women Only" gyms, "Women Only" clothing stores, "Women Only" schools, "Women Only" washrooms, "Women Only" tanning salons, etc. And if a man goes into one of these women-only stores, he is immediately asked to leave the premises. Women also generally get the better end of the bargain in a divorce. How the fuck is that fair? Women are essentially treating men the way American society used to treat black people.
Feminism = Matriarchy