asked · 144k views
5 votes
How did the US claim Florida? Who owned Florida in the first place? Why did they give up the land to the US?

2 Answers

1 vote

Answer:

Idk, Florida is shaped like the male genitalia of the U.S.

Step-by-step explanation:

You're welcome.

answered
User BenoitVasseur
by
7.7k points
7 votes

Answer:

The U.S. acquired Florida from Spain through the Adams-Onís Treaty of 1819, also known as the Florida Purchase Treaty. Spain owned Florida first, ceded it to Britain in 1763, and regained it in 1783. By the early 1800s, Florida had become a burden that Spain could no longer afford to defend or govern, so Spain ceded it to the United States.

Step-by-step explanation:

I hope this helps.

answered
User Adrian Klaver
by
8.8k points

