I started calling myself a feminist when I was about 16 or 17 years old, and it was just like one of those things where once it was pointed out to me, I started to see shit everywhere. The way I would have to yell to get my point across in debate team in middle school, but then a boy could say the exact same thing and get immediate respect and attention. When I made jokes or messed around, I got yelled at by teachers for “flirting” with the boys. Or the way girls my age, including me, were taught to hate our bodies and starve ourselves while simultaneously being forced to be “up for anything!!!” so you weren’t that girl.
Boys would get praised for the exact same things girls would get yelled at for. They got to be defined by sports or academics, while I got defined based on how many boys liked me. Once this got pointed out to me, I was furious all the time, and feminism was the only thing I was interested in. Presentations on female anarchists, counting up the number of female writers we read in English class and asking why there weren’t more, intense fights with other classmates about how the NFL promotes rape and violence against women. I was all about it.
I am so grateful that I found feminism when I did, because it kind of saved my teenage self-esteem from being next to nothing. Once I realized that I felt shitty because powerful rich people want to make money off of making me feel shitty, I couldn’t let myself do that anymore. Instead of hating my own body, I was able to direct that hate toward the people that made me hate it. It did amazing things for my confidence and I feel so lucky to have found it when I was a teenager.
But I’m also pretty pissed off that I have to feel “lucky to have found” feminism. Women’s studies should be taught in public school. That is one of the most firmly held beliefs I have. I’m appalled that it’s not required in college. How can you be expected to talk about literature, film, or history without being educated in feminism?
I was an English major, essentially, and the frustration I felt at my school’s curriculum was enough to make me graduate early and get out of there as soon as possible. The fact that we treated white history/art as a requirement, while the history and art made by racial minorities was seen as elective or extra, was SO FUCKED UP TO ME, but no one else really seemed to think so. Issues of privilege or diversity were shrugged off in a “well, there’s not enough material” kind of way, but I know that’s not true!!!! I absolutely hated that studying literature meant I had to be “well-versed in the classics” and “know the canon,” when all of that was code for white male literature. Work written by men was always seen as more important, more necessary, and that spilled over into the entire culture of my school.
And students were not equipped to talk about it. Every literature class where feminism or gender studies was brought up, we had to start back at the basics. I got exhausted having to explain and explain and explain basic women’s studies to students, to usually only get met with “oh well yeah BUT…” because there was absolutely no understanding there. They had nothing to reference, because they were only exposed to feminism as a kind of niche concept that didn’t apply to them.
Sometimes, I’m hopeful. Feminism seems to be moving more into mainstream culture, with Beyonce standing in front of the word “feminist” at the VMAs, and other popular musicians like Nicki Minaj, Taylor Swift and Lorde and Haim being very vocal about feminism and encouraging girls to speak up for themselves. Things have definitely changed, and I think a lot of girls are learning about it probably even younger than I did, and that’s great.
It’s just also kind of horrifying to see what mainstream feminism chooses to focus on. And one of these things is demonizing women for not being feminists.
For mainstream feminism, the things that seem to be most important are:
- every woman (ONLY WOMEN) identify as feminists regardless of their knowledge on the subject
- BUT WE STILL LOVE MEN
I have such an issue with the way we force women and ONLY WOMEN to identify as feminists. I hate how we ask female celebrities about their feminism, but we don’t ask men. If we’re getting celebrity opinions on feminism, why just ask Kaley Cuoco? Let’s ask her male costars, Jim Parsons and some other sad people on Big Bang Theory.
And then, on the flip side, when a man chooses to say he’s a feminist he is PRAISED for it, LAUDED ABOVE ALL OTHERS for being the HERO WE NEED. Aziz Ansari goes on TV and says shit women have been saying for decades, and all of a sudden he’s a genius visionary. Louis CK makes jokes about how women should be terrified of men, and everyone acts like it’s never been said before.
THANK YOU MEN!!!!!!!
I couldn’t care fucking less who is and isn’t a feminist. I don’t think it’s important. Obviously, yes, I would love it if everyone stood up and declared themselves as such, but saying it doesn’t mean you’re actually going to do anything. Calling yourself a feminist is great, but I also hate this “WOMEN ARE DOING GREAT!!!” vibe that we feel the need to shove into every award show on TV. Because it’s not true. It means nothing to me.
And as far as men go, I fucking laugh when I hear someone say “well, I’m not a feminist because I don’t want women to be above men,” because nothing shows how little you understand about women’s position in this society more than thinking women could EVER be ABOVE men. At least, it’s not happening in my lifetime, or my children’s lifetime, or my grandchildren’s lifetime. It’s not.
But then I also hate when people counter that with “feminism isn’t about hating men! it’s about equality!!!” because feminism isn’t about equality to me. There is no “equal” now, not the way this society is set up. There is no such thing. Feminism, to me, is reconstructing this entire culture for everyone that is oppressed. I don’t think you’re a feminist if you’re not also educated in racial inequality, gay rights, trans* rights, classism, ableism, and everything else that’s telling people that they are less than. Or that they deserve violence, or other people have the right to not treat them with respect.
So I’m fine that Kirsten Dunst doesn’t think she’s a feminist. I’m okay that Meghan Trainor is the way she is. Yeah, it sucks, but so does this fake feminism, where we praise ourselves for listening to a woman a few times, then make jokes about having sex with an underage Ariana Grande.
This problem is so much bigger than women not calling themselves feminists. Instead of calling them idiots or dumb bitches, let’s move on and focus on why these women are held to a higher standard of feminism than men. And even if everyone woke up tomorrow and declared themselves feminists, nothing would actually change. It goes way deeper than that.
I love feminism, and I am grateful for it. It is so important to me to empower women and girls, and to educate men. Feminism gave me so much that I didn’t have before. But the fact is that we still live in a culture where finding feminism was lucky for me, and even now, when the biggest pop star in the world says she’s a feminist, I don’t think girls are getting the bigger picture. There are SO MANY WAYS to be a woman, not just a skinny white straight girl, and EVERY SINGLE ONE needs to be present and celebrated!! Let’s stop holding women to a different standard than we hold men. Please. Dear God.
If I see one more fucking “THIS WOMAN ISN’T A FEMINIST BURN HER!!!!!!” article I’m gonna tear my goddamn hair out.