After WWI, what did Germany feel it needed to do as a country?

Group of answer choices

provide more jobs and build schools

reclaim territory and regain prominence in Europe

start another war

support the people and give financial assistance

1 Answer

I think it's: reclaim territory and regain prominence in Europe. After the Treaty of Versailles stripped Germany of land and weakened its international standing, the country felt it needed to recover that territory and restore its position as a major European power.
answered by User Vanesa