asked · 163k views · 1 vote
How did America's entry into World War I change America?

1 Answer

9 votes

Answer:

America's entry into World War I had a major impact on U.S. domestic politics, culture, and society. For example, women took factory and other jobs left by men who went off to fight, and their wartime contributions helped win passage of the Nineteenth Amendment (1920), which gave women the right to vote. At the same time, other groups of American citizens, such as German Americans and antiwar dissenters, were subject to systematic repression under wartime measures like the Espionage Act (1917) and the Sedition Act (1918).


I hope this helps!

answered by Sheldonhull (8.1k points)
