How were German Americans treated during WWI?

asked by User Dboswell

1 Answer


Answer:

During WWI, German Americans faced widespread suspicion, discrimination, and anti-German hysteria in the United States.

Step-by-step explanation:

After the United States entered the war in 1917, anti-German sentiment swept the country. Many states restricted or banned the teaching of German in schools, and German-language newspapers, clubs, and churches came under heavy pressure. German names and foods were rebranded (sauerkraut became "liberty cabbage"), and many families anglicized their surnames to avoid harassment. German nationals were required to register as "enemy aliens," and several thousand were interned. Hostility occasionally turned violent, most notoriously in the 1918 mob lynching of Robert Prager in Collinsville, Illinois.

answered by User Petras Purlys

