asked · 187k views
1 vote
The Treaty of Paris gave Florida back to France.
(T) True
(F) False

1 Answer

3 votes

The Treaty of Paris of 1763 ended the French and Indian War (the North American theater of the Seven Years' War) between Great Britain and France and their respective allies. Under the terms of the treaty, France gave up all of its territory in mainland North America, effectively ending any foreign military threat to the British colonies there. Florida was ceded to Great Britain by Spain, not returned to France, so the statement is false.

answered by Catalin Iancu (7.5k points)

