asked · 24.9k views
2 votes
What changes began happening in America once World War II was over?

2 Answers

3 votes
The Red Scare and the Cold War came after WWII. Anti-communist sentiment grew quickly, which led to many witch hunts for suspected communists in the government. Famous examples include the Hollywood Ten and the rise of McCarthyism.
answered by Sardar Khan · 7.8k points
1 vote
Economic prosperity was the biggest change for the USA after the war. Other changes included minority groups, including African Americans, Latinos, and women, beginning to fight for their civil rights.
answered by Eero · 8.0k points
