asked · 90.6k views
5 votes
What bad changes happened in the USA after World War 2?

2 Answers

6 votes
After the war ended in the summer of 1945, soldiers began to come home to their families. Over time, industries stopped producing war equipment. Life was more peaceful, and the economy was stronger than ever.
answered by User Williamsandonz (7.9k points)
5 votes
The aftermath of World War II was the beginning of an era defined by the decline of the old great powers and the rise of two superpowers: the Soviet Union (USSR) and the United States of America (USA), creating a bipolar world.
answered by User Gazler (8.3k points)
