The Walking Dead
I finally got to see the entire first season of The Walking Dead. These were the best-looking zombies I've ever seen; the makeup was fantastic. But what really impressed me was how the show conveys the humanity of the zombies. It doesn't portray them as comedic or as shuffling targets. It makes them out to be people: the dead bodies of loved ones, people who had lives. Every walking corpse is a tragedy with a story. The main character puts forth a lot of effort to respect the dead and remember them, and that adds to the dread. These aren't faceless zombies; they're dead people. More than any other zombie show I've seen, The Walking Dead comes closest to showing how terrifying a walking corpse would really be. I really dislike seeing dead bodies, and to me a walking corpse would be absolutely terrifying in real life.

The drama is good, with great, complex interaction between the characters, though their decision-making often seems detached from reality. When the group passes the broken military barricades, there is equipment, Humvees, and guns lying all around, and not once does any of them even pause to pick something up. Why aren't they arming the other women and teaching them to shoot? You'd think self-defense would be a higher priority given the situation. Also, I have no respect for characters who simply give up. The guy who was bitten, I can understand, but the woman who decides to stay behind and get blown up? It's a weak and cowardly way out, and I have no sympathy.

Other than that, it's a great show and I love it. Now I'm just waiting for the second season.