What happened to most Native Americans as the West was settled?

1 Answer


As white settlers moved into the American West, Native Americans were pushed off their ancestral lands and confined to reservations. This displacement was often accompanied by armed conflict between Native American nations and the US Army. Once most tribes had been confined to reservations, the US government turned to forced assimilation, seeking to erase Native American cultures, languages, and traditions.

answered by Antonio Dias (8.0k points)
