My initial thought is that the stereotypes of the genre hurt it quite a bit. People assume it will be about bank robberies, killing Native Americans, or standoffs. It's all been done before -- the setting doesn't seem to provide much. If you want frontiers, you go to space, near futures, etc.
Perhaps more importantly, I just don't think the mythologized Old West holds much weight or fascination anymore. Even when I lived in rural Texas, anyone walking down the street in a cowboy hat was snickered at unless he was an elderly gentleman. I think it largely comes down to the myth being dead. Nobody wants to be John Wayne any longer; the idea of a lone cowboy sitting at a campfire simply isn't that appealing.
I could be wrong, but that's the general feel I get when I hear people react to someone suggesting they watch a Western.