16.0k views
1 vote
How did World War One lead to significant changes in society?

asked by User Hanggi (8.3k points)

1 Answer

4 votes

World War One changed the lives of Americans socially, politically, and economically. The war had a massive impact on almost every part of society, particularly women, workers, and minorities. The American public felt a strong sense of nationalism and patriotism during the war, as the country was unified against a foreign threat. However, the war also brought constant scrutiny and prejudice against minority groups, such as immigrants from Southern and Eastern Europe.

answered by User Mafue (8.1k points)

