After World War I, nativism became more prominent in America. What is nativism?

Answer:

Nativism was an anti-immigration movement that favored the interests of native-born Americans, especially those descended from the inhabitants of the original thirteen colonies, over those of new immigrants.

answered by Dhruv Ramani (8.7k points)
