asked · 192k views
4 votes
What social changes took place in the United States after WWII? What role did the war play in those changes? PLEASE HURRY!!

1 Answer

3 votes

Answer:

After World War II, the United States was in a far stronger economic position than it had been before the war. Wartime production had revitalized American industry and left it intact while much of Europe and Asia lay in ruins, and it was in these postwar years that the United States became a global superpower, exerting influence in economic, political, military, cultural, and technological affairs.

answered by Ratsbane (8.1k points)

