asked · 121k views · 11 votes
For many European countries, the end of WWI was the beginning of ______.

1 Answer

7 votes

Answer: The peace treaty that officially ended the conflict, the Treaty of Versailles (1919), imposed punitive terms on Germany that destabilized Europe and laid the groundwork for World War II.

answered by Zin Win Htet (8.8k points)
