How did the war benefit Americans?

1 Answer


The war brought full employment and a fairer distribution of income. Black Americans and women entered the industrial workforce in unprecedented numbers. The war also consolidated union strength and brought far-reaching changes to agricultural life.

answered by Ingmars (9.0k points)
