asked 118k views
4 votes
What lasting political changes were brought about by World War 1? PLZ ANSWER RIGHT NOW!!

1 Answer

5 votes
The Great Migration brought African Americans (and many Mexican Americans) to northern cities; women gained new rights, though most of those gains were rolled back after the war; education improved; society became more regimented; the economy boomed briefly before the Great Depression hit; and anti-German sentiment ran high among Americans.
answered by Farhana (7.9k points)
