What happened in the United States after the Europeans colonized (or took) the land from the Native Americans? In other words, once the Europeans (now called Americans) had taken the land, they held millions of acres of it.

1 Answer


Answer:

Colonization ruptured many ecosystems, bringing in new organisms while eliminating others. The Europeans brought many diseases with them, which decimated Native American populations. Colonists and Native Americans alike looked to new plants as possible medicinal resources.

Hope this helped!

answered by JMabee (7.9k points)
