Has there ever been a time in the history of feminism when the word didn't evoke a strong negative response from people who are either ignorant of the movement or would rather stick with the status quo?
100 years ago or so, it was a fight to get women the right to vote. There were plenty of people who had disdain for feminists back then, who thought they were man-haters, thought they wanted "special" rights, and believed women already had equal rights.
50 years ago or so, it was a fight to get women into the workplace and into political office, to make the birth control pill available, and to remove the stigma of the working mother. There were plenty of people who had disdain for feminists back then, who thought they were man-haters, thought they wanted "special" rights, and believed women already had equal rights.
As far as I'm concerned, it's the same old, same old. I'm willing to be thought of negatively if it means my daughter can have equal pay for equal work, have full reproductive rights and access to contraception and Plan B wherever she chooses to live, and can see the very real possibility of watching a woman be elected President (or of running herself someday).
If I'm branded negatively now, but my efforts help to put an end to rape culture, I'll take it. I think it's more than worth it. That's just me, though. If y'all want to make feminism sound attractive, however, go for it. Doesn't bother me.