Which nation was the first to establish colonies in America?

1 Answer

The first nation to claim land in the Americas was Spain, followed by Portugal. In fact, in 1494 the two powers signed the Treaty of Tordesillas, agreeing to divide the Americas between them, and for a time it seemed that Spain would control almost the entire continent.

