The Nazis did not come to power in Germany promising to:

1 Answer


Answer:

The Nazi Party was one of a number of right-wing extremist political groups that emerged in Germany following World War I. Beginning with the onset of the Great Depression, it rose rapidly from obscurity to political prominence, becoming the largest party in the German parliament in 1932.


answered by Eveningsun (8.2k points)