Did the war change the role of women in American society?

1 vote, 84.1k views
Tags: History, high-school
asked Oct 11, 2019 by The Cookies Dog (8.5k points)
1 Answer

2 votes

Women's work in WWI: During WWI (1914-1918), large numbers of women were recruited into jobs vacated by men who had gone to fight in the war. New jobs were also created as part of the war effort, for example in munitions factories.

answered Oct 15, 2019 by Analia (8.3k points)