Before we dive into who should consider themselves a feminist and why, we must first look at some history.
Though the organized American feminist movement only began with the Seneca Falls Convention of 1848, women have been striving for equality with men since the beginning of recorded history. Even in Ancient Greece, where women were largely shut out of government, formal education, and most professions, the pantheon openly honored powerful female deities such as Artemis and Athena alongside the male gods.
Across Europe, the norm that women were subordinate to men persisted well into the late nineteenth century. These norms traveled to the North American colonies and spread across the new United States. In striking contrast to the Greeks, nearly 85 percent of Americans worshipped a single, traditionally male god, and the idea of a female deity would have been laughable.
Elizabeth Blackwell did not become the first woman in America to earn a medical degree until 1849, more than two centuries after Harvard, the first college in the colonies, was founded in 1636. This is not to say that women didn't do their fair share during that time; they worked just as hard as men for a fraction of the pay.
To say that the problems of gender inequality are anywhere close to solved today is simply untrue. The wage gap persists, and sexual assault and harassment remain serious issues for women. The word "feminist" carries a lot of weight: it is frequently associated with radical opinions about equality, and feminists are often portrayed as hysterical women raging against all men. It's true that some strains of feminism focus too narrowly on women's issues, and as a movement, feminism can seem to demonize men and blame every problem on the patriarchy. But feminism isn't inherently bad. When we hear the word "feminist," we must remember that although people may misuse the label, the purpose of feminism is solely to fight for equal rights, not to put any gender above another.
In truth, it’s your choice. Labels cannot tell the entire story.