asked · 84.1k views
1 vote
Did the war change the role of women in American society?

1 Answer

2 votes
Women's work in WWI: during the war (1914-1918), large numbers of women were recruited into jobs vacated by men who had gone to fight. New jobs were also created as part of the war effort, for example in munitions factories.
answered by User Analia (8.3k points)
