asked 200k views
3 votes
What were some of the effects of the explorations of the west in the 1800s?

1 Answer

4 votes
I think the "west" here refers to the western United States.

First, exploration brought knowledge of the western US.

Then, it led to the disappearance of many Indigenous cultures in the west of today's US.

Finally, it led Europeans and Americans of European descent to settle in those areas.
answered by User Orquesta (8.0k points)
