I just watched Episode 1 of The Walking Dead television series. I found it enthralling, and scary as hell. I love stuff like that. I've also had lots of lucid nightmares about being caught in a zombie apocalypse: getting bitten, getting sick, knowing I'm going to die. And from what I've heard, both the fascination and the nightmares are true of millions of Americans.
Now, why, dammit? What in the world is so compelling about zombies and, worse, a zombie apocalypse? This has got to be something straight from the collective unconscious, but I wish I knew what.