What happened to German territory in the east after WWI?

asked by User Cereal

1 Answer


Germany lost World War I. In the 1919 Treaty of Versailles, the victorious powers (the United States, Great Britain, France, and other allied states) imposed punitive territorial, military, and economic provisions on defeated Germany. ... In the east, Poland received parts of West Prussia and Silesia from Germany.

answered by User Chris Simmons

