asked · 130k views
5 votes
How did World War 1 change America?

2 Answers

4 votes

Answer:

Despite isolationist sentiment at home, the United States emerged from the war as a world leader in industry, finance, and trade. Nations also became far more economically interconnected, ushering in the beginning of what we now call the "world economy."

Hope this helps!

answered by User Lku (8.2k points)
3 votes

Answer:

Despite isolationist sentiment at home, the United States emerged from the war as a world leader in industry, finance, and trade. Nations also became far more economically interconnected, ushering in the beginning of what we now call the "world economy."

answered by User Ironchicken (7.4k points)
