How did World War I change the role of government in the United States?

1 Answer


Answer:

It produced a much closer relationship between the government and private industry. Since the United States saw no major battles or attacks on its own soil, the government's expanded wartime role at home came mainly through mobilizing industry and the economy for the war effort.


Hope this helps!

answered by Tobias Lorenz (7.8k points)

