asked · 60.0k views
0 votes
How did German people feel about their nation after World War I?

2 Answers

2 votes
They felt anger toward the Kaiser because they blamed him for losing WWI; that resentment is also part of how Hitler rose to power.
answered by User Vivek Kalkur (7.9k points)
5 votes

After the First World War, Germany had become a ruined, humiliated, and politically unstable state. The war had left the people exhausted, and morale was low.

The situation worsened on June 28, 1919, with the signing of the Treaty of Versailles between Germany and the Allies: the country's delegation, the newspapers, and the people understood the treaty as an act of imposition rather than a negotiation.

answered by User Battlmonstr (7.9k points)