How did the role of women in the United States change during and after World War II?

asked by User Dasher
1 Answer

The role of women changed during World War II because women took on jobs while their husbands and male family members were away at war. Originally, the plan was for things to go back to the way they were once the war was over. However, many women spoke out against that idea when the war ended, and as a result they were able to keep their jobs. In the decades that followed, women gained many of the same legal rights as men.

I hope this helps.
answered by User Riba

