111k views
1 vote
What social and economic changes arose from the war?

asked by Natiiix (7.7k points)

1 Answer

2 votes
WWII went a long way toward ending the Depression and ushering in an era of prosperity, including a great expansion of home ownership and of the suburbs. Socially, it marked the beginning of increased freedom and opportunity for women, Black Americans, and poor men. Women had worked in factories and gained greater confidence, while Black Americans pointed to the hypocrisy of the US opposing Nazi racial ideology while tolerating racism at home. In addition, the GI Bill brought much greater opportunity and upward mobility for men of all socioeconomic backgrounds, as higher education was no longer reserved for an upper-class elite.
answered by Personman (8.2k points)

