203k views
1 vote
Women had many new experiences as a result of World War I, including working at new jobs, wearing new fashions, and acting more independently. What other new change came to women just after WWI?

asked by User KVK (8.4k points)

1 Answer

6 votes
After World War I, women became increasingly part of mainstream life, which eventually led to women gaining the right to vote.

Before the war, women's daily lives were based on traditional roles such as mother and housewife.

The war, however, changed things dramatically, and women ultimately earned the right to vote in general elections.
answered by User Kisna (8.3k points)