143k views
2 votes
What are some of the positive things that happened in America as a result of WWI? What new opportunities did people have?

asked by User Puczo (9.2k points)

1 Answer

6 votes
Women's rights were one positive development born out of WWI. Women were needed to do more than stay at home and were asked to take factory jobs because of the boom in weapons production, the loss of men to enlistment, and the economy's need for support.
answered by User Fikret (8.0k points)
