How did World War 1 change the US?
1 vote, 23.2k views
asked by Vahdet (7.6k points)

1 Answer

2 votes

Answer: The experience of World War I had a major impact on US domestic politics, culture, and society. Wartime mobilization expanded the reach of the federal government, demand for labor accelerated the Great Migration of African Americans to northern cities, women's contributions to the war effort strengthened the push for suffrage, and the war's aftermath brought the first Red Scare and a turn toward isolationism.

answered by Thealon (8.0k points)

