asked · 17.4k views · 3 votes
How did World War I impact the lives of women in the United States after the war?

2 Answers

6 votes
Women took on new roles.

answered by Kabir Sarin (8.5k points)
3 votes

Answer:

Women took on new roles in the workforce, notably in war production and agriculture.

answered by Khuttun (8.0k points)

