It seems most feminist theory is based on the idea that men are, to quote George Carlin, “vain brutal greedy assholes who have just about ruined this planet.”
Which is true.
So, one school of feminism says, why can’t women also ruin the planet in the name of equality?
Well, fuck you too.
The other says that patriarchal society, which is apparently every society except for like two in sub-Saharan Africa where the people dying from mosquito bites really have their shit figured out, fosters a culture of oppression against women at the hands of men.
A few questions arise:
Since women have been alive roughly as long as men, wouldn’t this collectively be their fault, and isn’t that sentiment inherently demeaning?
Furthermore, if this is the way that every society, ever, has turned out, with men dominating industry and providing food and housing or whatever while women stay home and complain about it, wouldn’t your life more or less be a natural chain of events?
Clearly humanity is not a grand conspiracy.
If your dad wasn’t catcalling your mom, for example, would you even be alive to complain about it?
Does your problem lie in sexism, or in an existential crisis that’s latching onto the gender issue instead of the entire meaning of life? Which, I’ll admit, as someone who advocates for equality, is perplexing.
Can you have it both ways?
Is it possible you hate yourself?
Keep in mind that half your chromosomes come from dudes without ponytails.
You’re made up of the people you hate.
What a fucking conundrum.
Follow me on Twitter @Matthewralston