Shad
Veteran Member
You're most likely correct, although I don't think America would have been in a bad spot if WW2 had never happened.
Sure. Some advantages are geographical. Obviously the US has resource superiority over the UK or any other smaller nation-state, and it did prior to the war as well.
But both World Wars helped shape Americans' perceptions of Europe and the rest of the world, along with saddling us with the burden of "making the world safe for democracy." I don't think many Americans relished that role or really wanted it - at least not at first.
Sure. However, with the benefit of hindsight, no one else who was acceptable was going to step up. It would have taken a crisis to force remilitarization after post-WW2 disarmament, which was the exact problem before WW2: unprepared nations were unwilling to act until it was too late.