How did World War I change the US?


2 Answers

4 votes

A. It brought the US out of a recession and into a period of prosperity.

answered by Michael Dz (8.3k points)
5 votes

Answer:

World War I was a turning point in American history and in the lives of Americans. It brought a permanent end to American isolationism.

answered by Adam Starrh (8.9k points)

