So ever since I found out my mother was once anorexic (and an unrepentant one at that -- she'd go back to starving herself if she thought it would help), I've had a fascination with how our culture shapes the way women perceive their ideal bodies. It's a feminist issue too, which only makes digging into it more interesting.
Take this article, for example. It seems that eating disorders are treated as a natural part of female existence here in the US. Is anyone really surprised? I remember other girls eating croutons or crunchy noodles for lunch, and how popular the salad bar was simply because of the idea that salad = weight loss. Not that I'm against salad. I'm a vegetarian, after all. But still, it wasn't for love of lettuce that those girls were always downing salads. It's creepy that we as Americans see nothing wrong with the sort of diet religion that's advertised everywhere. Come to think of it, I don't think I know a single woman who isn't on a diet of some sort right now. Creepy.
And apparently this was Sexual Assault Awareness Month, if metafandom is to be believed. So yeah, lots of feminist issues taking center stage all of a sudden.