3 votes
How did women's roles in countries such as the United States and Britain change after World War I? Check all that apply.

asked by User Ebenezer (8.2k points)

1 Answer

5 votes

Answer:

Society became more open, and women experienced greater freedom.

Women began to seek out new careers.

Women challenged old traditions by doing things such as changing their clothing style.

answered by User Colin Anthony (7.6k points)

