How did World War 2 change the role of women?

1 Answer

Step-by-step explanation:

World War 2 led many women to take jobs in defense plants and factories across the country, significantly changing the role of women in the workforce.


answered by James P (7.9k points)
