Hello, Chrons.
I already know this question doesn't really have a proper answer, but I thought it might be fun to discuss.
I've read a couple of novels and seen a few films that tackle the idea that aliens, or Big Nasties of some description, invade Earth and all the nations of the Earth put aside their differences and fight the aforementioned Big Nasties together in a unified front of Humanity, probably to the music of AC/DC or something; these stories usually end with America saving the world single-handedly with tactical applications of 'Merica-ness and Patriotism while all the other nations of the world swoon, but that's not the point here.
Now, here's what I'm thinking: would the nations of the Earth really put aside their centuries-old, hardwired tribal nature and fight alongside each other? My opinion would be a gritty, in-your-face no.
Okay, we've got NATO, for example, but countries technically don't have to do what NATO says, and let's be honest, if something as serious as an alien invasion happened, the governments of the world would be looking after them and theirs.
Can you see France taking orders from America? Or America taking orders from, well, anyone? Or heck, let's go to the polar opposite: can you suddenly see India and Pakistan fighting together? Or South Korea and North Korea suddenly joining forces? Russia suddenly shaking hands with the West so we'll all be buddies and fight this thing together? To quote one of my favourite TV doctors, Dr. Cox from Scrubs, when discussing what people are: "Do you know what they are mostly? Bastards. ******* coated bastards with ******* filling."
I only made it two series in before giving up, but when I used to watch The Walking Dead my wife would say things like, "I don't get it. Why, when you've got a zombie apocalypse, would you bother fighting each other? You'd work together." Please see the above quote.
In my heart of hearts I'd like to think that Humanity will grow up one day, but I doubt it.
What sayeth you?