On which continent did England become dominant after the French and Indian War?

asked by User Halllo

1 Answer

North America. With its victory in the French and Indian War (ended 1763), Britain gained France's territory east of the Mississippi River and became the dominant European power on the continent.
answered by User Nithin Baby
