111k views
5 votes
The colonists who became the original United States were part of which European nation's land claims?

asked by Imrul (8.0k points)

1 Answer

7 votes
The colonies that became the United States were originally British colonies, under the authority of Great Britain.
answered by Bocercus (8.4k points)

