A major change women experienced during the post-World War I era was that they started

2 Answers

3 votes

Answer:

They began to work outside the home.

Step-by-step explanation:

Took the test

answered by User John Polo (8.9k points)
4 votes
Since the men were away at war, women began to work as nurses, in factories, and in other workplaces.
answered by User Yin Gang (7.2k points)

