asked · 48.0k views
0 votes
What are some positive and negative changes that occurred in the United States in the years after World War II?

1 Answer

4 votes

Answer:

American society became more affluent in the postwar years.

answered by Kasun Kodagoda (8.2k points)

