233k views · 23 votes
How did life change for women in the United States after World War I started?
A) More women had children.
B) Women lost the right to vote.
C) Women began to earn more than men.
D) More women got jobs outside the home.

asked by ThinkBig (8.5k points)

1 Answer

4 votes

Answer:

D) More women got jobs outside the home.

answered by Mearaj (8.0k points)

