85.1k views · 1 vote

How did the Great Depression eventually change Germany politically, and how did America enter the war?

asked by Olsydko (9.1k points)

1 Answer

1 vote
The Great Depression changed Germany politically by fueling the rise of the Nazi party and the spread of fascism, effectively a dictatorship backed by Germany's elite industrialists. The Nazis blamed Jewish people for the country's economic problems, deflecting criticism away from those who had helped bring the crisis about, namely the big industrialists. As for America entering WWII, that came when Japanese forces attacked Pearl Harbor, forcing the US to respond and go to war.

answered by Travis Mehlinger (7.9k points)
