When I first thought of feminism, I thought of it as something to be proud of as a girl. But every day I realize more that feminism has become something we Americans should be ashamed of.
Today, feminism means hating men and blaming them for all the problems of the world. Feminists blame men for not giving women equal rights. First of all, if you're waiting for a man to give you equality, you don't deserve equality.
The Women's March is something everyone keeps talking about and is so proud of, but please tell me: how did that help women? It only gave the media more stories to tell bored people stuck in their cars during traffic.
Women ARE equal to men, and men are equal to women. Going around accusing men who did absolutely nothing to you, and judging them for supposedly making your perfectly normal life miserable, will not get you anything.
Do you want to know how women got rights, freedom, and equality in America? By hard work. The simplest example is World War One and World War Two: while the men were gone, women proved to themselves first, and then to everyone else in the world, that they CAN do what men do.
It's hard work! That's all you need. Writing slogans on a big piece of paper and blocking the roads will not change anything.
We women worked hard to get where we are, and creating a word and hating men will not help us get anywhere. We must love each other, because men and women live together. We have to work together. We have to respect each other.