What permanent changes to the federal government resulted from the New Deal?

1 Answer

Answer:

The New Deal redefined the role of the federal government, convincing most ordinary Americans that the government not only could but should intervene in the economy, as well as protect and provide direct support for American citizens.

