asked · 180k views
3 votes
How did the role of women change in post-war society?

1 Answer

4 votes
Women's roles changed in post-war society mainly through their increased presence in the workplace: with most men away at war, women had entered the workforce in large numbers, and that shift carried over after the war ended.
answered by Bibliophilsagar (7.7k points)

