bleak movies/TV/books?

Feb 05, 2008 17:12

I'm in the middle of teaching naturalism to my American lit students, and as I try to figure out whether there's any merit to that type of literature, as dark as it is--or why anyone would want to read it--I keep wishing for contemporary examples I could point my kids to, to make the discussion richer.

The problem? I can't right off the top of my head think of any.

Naturalism is an offshoot of realism.

Realism wants to pay close attention to the details of everyday life, usually groups or ideas previously dismissed as not worth writing literature about, and objectively paint a picture of them for the reader, almost like a documentary. It's the opposite of romanticism with all its figurative language and symbolism. Realism is often kinda depressing (because the world is), but there's a bit of hope--characters have responsibility for their actions and, therefore, to a certain extent, their fate.

Naturalism does similar things in picking topics and recording life, but its outlook on personal responsibility is different. It says that forces bigger than you--like nature or the socio-economic system (especially in big cities)--are more in control of your world than you are. Oh, and God doesn't care about you or doesn't exist at all. We're talking serious bleakness, here.

Can you think of any movies, TV shows, or books my 19- and 20-year-olds might know? All we came up with as a class were examples of moments of this kind of thinking (like the dad in Signs, mid-movie, being all cynical and atheist) that invariably turn out okay in the end. I need something unrepentantly cynical about human nature and God concepts. (Don't get me started on 'Lost.' I've typed up a lot of meta and deleted it from this post already. And my kids already came up with that one, with no coaching from me. :) )

Help?

