How did the Treaty of Paris change the United States?

1 Answer


Answer:

The Treaty of Paris was signed by U.S. and British representatives on September 3, 1783, formally ending the American Revolutionary War. Building on a preliminary treaty negotiated in 1782, the agreement recognized the independence of the United States and granted it a large expanse of western territory, extending the new nation's boundary to the Mississippi River.


answered by User Heah

