127k views
1 vote
Who owned Florida when America bought it?

asked by TheDrot (8.8k points)

1 Answer

4 votes
Spain owned Florida when America bought it. The United States acquired Florida from Spain under the Adams-Onís Treaty of 1819 (ratified in 1821). Great Britain had held Florida earlier, from 1763 to 1783, but ceded it back to Spain well before the purchase. Hope this helps you :)
answered by David Herrero (7.5k points)
