2 votes
How did World War II change the role of corporations in American life?

2 Answers

4 votes
Hey there,
World War II transformed the role of the federal government and the lives of American citizens. To secure industrial peace and stabilize war production, the federal government forced reluctant employers to recognize unions.

Hope this helps :))

~Top
answered by Zach Russell (8.7k points)
2 votes
World War II changed the role of corporations in American life by ending the wars, and people just moved on with their lives and didn't know what to do after that. Hope this helped, have a great day! :D
answered by Tszming (8.3k points)
