asked · 16.6k views
21 votes
How do you think Germany’s treatment at the end of World War I might influence the rise of Hitler?

1 Answer

5 votes

Answer:

At the end of World War I, Germany drew the short straw. The Treaty of Versailles forced it to withdraw from the territories it had occupied (such as Belgium) and to cede land in the east to the newly restored Poland, drastically limited its military, placed the blame for the war on Germany through the "war guilt" clause, and required it to pay reparations for the damages. Hitler was one of many people who stood against what the Allies were doing to his country, and he gained support from its citizens. His fame among the German people fueled the fire of more nationalism and resentment toward the Allied countries. Over the following years he rose through German politics as leader of the Nazi Party and ultimately became the Führer.

answered by User Barbarity · 8.1k points
