What was the role of America after WWI?

1 Answer


Answer:

Under President Woodrow Wilson, the United States remained neutral until April 1917, when it entered the war on the side of the Allied powers (the United Kingdom, France, and Russia). The experience of World War I had a major impact on U.S. domestic politics, culture, and society in the years that followed.


answered by Jonsidnell (9.4k points)
