asked · 161k views
2 votes
How do you think the changing roles of women in the United States were reflected in their experiences during wartime?

1 Answer

2 votes

Women in the US began applying their skills and knowledge to take on and keep jobs that men had usually held, while those men were away at war.
answered by Methnani Bilel (8.1k points)

