137k views
2 votes
World War II was one of the most significant events of the 1900s and one of the most important events in US history. Think about how much the United States changed between the Great Depression and the postwar era, when the country had become an economic powerhouse. How did World War II influence and change the identity of the United States throughout the 1900s and into the present? What are some positive and negative changes that occurred in the United States in the years after World War II?

asked by Denesha (7.2k points)

1 Answer

3 votes

Answer:

One negative effect was that the United States and the Soviet Union emerged from World War II as rival superpowers, and their competition led to the Cold War. A positive effect was that the United States came out of the war much stronger, growing into an economic powerhouse in the postwar years. Hope this helps!

answered by Jasiustasiu (7.5k points)

