1 vote · 174k views
How did American industry change from the beginning of WW2 to its end?

asked by Epattaro (8.2k points)

1 Answer

4 votes

Answer:

During the war, women's roles became more important because men were sent away either to fight or to support the war effort. As a result, women's work was more widely recognised, which empowered them even after the war ended.

Hope this helps.

answered by Stas Boyarincev (8.7k points)

