First of all, slooooow zombies only- though I'll allow for the Infected in 28 Days Later and the first half of 28 Weeks Later, before they decided to toss out both the story line and the good actors.
My interest is in What Happens After- I used to love The Walking Dead, and still face every new season like the start of a love affair- with the inevitable disillusionment that follows- one of these days, though...
Was hoping to see it deal with actual issues in the ZA- the first and second seasons had glimpses- especially, say, feminism- how do modern women adapt to a world that suddenly throws them back into the kind of situation that led to patriarchy in the first place? Very briefly raised in Season 1, then dropped like a hot potato. Or in rural Georgia, they don't come across a single community that believes it's God's punishment for gay marriage, abortion and women wearing pants? Or racism- other than Rick's comment to Merle that there ain't no more black or white; there's only dark meat and light meat.
Fear the Walking Dead was a major disappointment- was hoping to see the gradual appearance of the disease, but I guess they figured too many people would say booooring.
Would like to see a very-post-apocalyptic setting, after guns and technology have disappeared. Let's face it, realistically zombies are pretty pathetic enemies, which is why the survivors in these shows keep having to do incredibly stupid things to give the brain-dead Undead an even chance.