How did working during World War II change American women?

asked by Calmrat (8.7k points)

1 Answer


Answer:

Though women had been joining the workforce in greater numbers since the hardships of the Great Depression, the entry of the United States into World War II completely transformed the types of jobs open to women. Before the war, most working women were in traditionally female fields such as nursing and teaching; during the war, millions of women took jobs in defense plants, shipyards, and other industries that had previously been closed to them.


answered by Ankur Mishra (8.5k points)

