Answer:
The war greatly changed the role of women in America. Before the war, women were largely confined to domestic life, which meant they were discouraged from working outside the home.