Could Germany have ever won WWII?

The US was involved in WWII well before its formal declaration of war. But would the US have declared war on Germany if Hitler hadn't declared war on them first? I would say probably not. And after Pearl Harbour, even less so.
 
As I see it…
A war between Japan and the US was inevitable.
Japan wanted to grow its own empire [it had learnt from the example of Great Britain, the US and others that it needed one], and the US didn’t want competition for trade and resources in the Pacific.
The Tripartite Pact was more of an “enemy of my enemy is my friend” arrangement.
And everyone should have ignored Mussolini.