How did the United States change after World War II?

1 Answer


Answer: The entry of the United States into World War II caused vast changes in virtually every aspect of American life. ... Building on the economic base left after the war, American society became more affluent in the postwar years than most Americans could have imagined in their wildest dreams before or during the war.


answered by Priboyd

