The colonies that became the original United States were part of which European nation's land claims?

1 Answer

The colonies that became the original United States were part of the land claims of England, which later became part of Great Britain.
answered by Mgadda (8.3k points)

