asked · 16.4k views · 2 votes
What changed in the US government after World War One?

1 Answer

3 votes
After World War I, the US government stepped back from involvement in the affairs of European countries and concentrated instead on its own internal affairs. This shift took hold when Warren G. Harding was elected president in 1920 on a promise of a "return to normalcy." In the years that followed, people had greater access to new technology and many workers earned higher wages. The government as a whole focused on making America a more prosperous nation at home rather than on intervening in the affairs of other countries.
answered by Matthew Wesly (7.9k points)

