166k views
0 votes
How did Florida become a territory of the United States in 1821?

A. Spain sold the territory of Florida to the United States.

B. The United States traded Georgia to Spain for Florida.

C. The United States won the war against Spain and took over the territory.

D. Spain gave the territory to the United States after the First Seminole War.

asked by User Suzana (8.5k points)

2 Answers

2 votes

Answer:

I believe it is A: Spain sold the territory of Florida to the United States.

Step-by-step explanation:

This is the answer on Edgenuity.

answered by User Bejoy George (9.4k points)
3 votes
Spain sold the territory to the U.S. under the Adams-Onís Treaty, with the U.S. assuming $5 million in American citizens' claims against Spain.
answered by User Henry Navarro (8.3k points)
