asked · 191k views · 5 votes
How did World War II transform American society?

1 Answer

2 votes
Hey there,
1) WWII helped pull the United States out of the Great Depression: wartime production, including arms supplied to the Allies, created millions of jobs and ended mass unemployment.
2) Women entered the workforce in large numbers to fill jobs left by men serving overseas, which reshaped attitudes about women's roles in American society.

Hope this helps :))

~Top
answered by TmTron (8.5k points)

