swalterh
A question that has always intrigued me is whether the U.S.A. would have declared war against Nazi Germany after the Pearl Harbor attack had Hitler not rashly declared war on the U.S. first. (The third of his big blunders; the first was his failure to invade the U.K., the second his invasion of the U.S.S.R.)
My reading of history suggests that there was no mood among the U.S. population at the time for getting involved in a European war, and I have often wondered whether that mood would have changed had Hitler not declared war. I myself think the U.S. people would have been rightly concerned with dealing with Japan first and leaving Germany to the British and Soviets. Any thoughts?
Wayne.
Wayne,
This was a question that we spent a good bit of time debating during one of my history courses. The good thing is that, in reality, Hitler made it rather easy for Roosevelt politically by declaring war on the US. The facts, as you suggest, lead one to believe that Roosevelt would have faced significant challenges in convincing the American people that a war against Germany was also necessary had Hitler refrained from his declaration of hostilities.
During the early years of the conflict (1939-1941) there was a sour taste in the mouth of the American public due to the perceived lack of gains from their efforts in the Great War. This was one of the more prevalent reasons behind the staunch isolationism in the US prior to the Japanese attack on Pearl Harbor. Despite this, it is highly probable that, given the truly global nature of the conflict and the German-Japanese alliance, it would have been very difficult for the US to remain uncommitted, at least in some form, to the war in Europe. IMO Roosevelt, a truly masterful politician (he did win the Presidency 4 times, no small feat), would have found some way to convincingly commit American forces to operations against the Germans.
Shane