asked · 146k views · 3 votes
How did the role of women in the U.S. change in the 1920s?

2 Answers

4 votes
Women gained the right to vote with the ratification of the 19th Amendment in 1920, which helped spark new waves of feminism.
answered by Joe Osborn (8.7k points)
3 votes

Answer: Women gained the right to vote, along with many other rights they had not had before. They could also work in jobs that had previously been open only to men. These new opportunities made them more independent.

answered by Gsg (8.3k points)

