I've been watching The Walking Dead since it first appeared on TV, and I have really mixed feelings. The first seasons were enjoyable when the characters were surviving and fighting for their lives, but then the show turned into a soap opera. I think it should have ended a few seasons ago. I understand that it's based on a comic, but still, it would have been a great decision.