132k views
5 votes
What political effects did World War I have on the United States?

asked by User Syam (8.0k points)

1 Answer

5 votes

Answer:

The experience of World War I had a major impact on US domestic politics, culture, and society. Women achieved the right to vote, while other groups of American citizens were subject to systematic repression.

Step-by-step explanation:

Women's contributions to the war effort strengthened the case for suffrage, and the 19th Amendment, ratified in 1920, granted women the right to vote. At the same time, wartime measures such as the Espionage Act of 1917 and the Sedition Act of 1918 were used to suppress antiwar dissent, and the postwar Red Scare brought systematic repression of immigrants, socialists, and labor activists.

answered by User Chthonic Project (9.0k points)

