3 votes

How did America get Florida?

asked by User Ku (8.9k points)

1 Answer

5 votes
In 1763 the Treaty of Paris was signed by Great Britain, France, and Spain, and it resulted in Great Britain gaining the Florida Territory. But when Great Britain formally recognized the colonies' independence (as the United States) in 1783, the Florida Territory was returned to Spain without a clear definition of its boundaries. The United States finally acquired Florida from Spain under the Adams-Onís Treaty of 1819, which took effect in 1821.
answered by User Wronex (8.7k points)

