What lands did the British gain?

1 Answer

Britain now claimed all the land from the east coast of North America to the Mississippi River. Everything west of that river belonged to Spain, because France had ceded all of its western lands to Spain to keep them out of British hands.
answered by Angga Ari Wijaya (7.9k points)

