How did the world wars change women's roles in American society?

1 Answer

Answer:
The wars opened up new jobs for women.
Step-by-step explanation:
With so many men away fighting, women stepped into roles on the home front, working as teachers and taking jobs in factories.
answered by Adivasile

