Oct 25, 2005 11:24
I find it a little unnerving when I meet a woman who feels there is a stigma attached to feminism or to being a feminist. Could it be because of the following?
At school, I remember being taught about every mainstream religion, a tiny bit about the old philosophers, and the different cultures of the world.
I was taught a few things women did to strive for equality, mainly the suffragettes, but I don't recall the slightest explanation of what feminism was.
I ACTUALLY remember, at the age of 13 or 14, saying I didn't like feminists because I thought they were anti-men and simply jealous of 'beautiful' women (I'm now 27 and know better).
I simply accepted the mainstream portrayal of women in the mass media as normal; if I heard a woman complain, then she was going over the top and reading too much into it. It wasn't until I was about 17 to 19 that I thought any differently.
Were any women or men here ever under a similar impression? Did you have to 'discover' feminism? Or did I miss the lessons we had?
language,
feminist mvmt general