asked 224k views
4 votes
Which important document announced that the American colonies no longer wished to be part of the British Empire?

2 Answers

1 vote

Answer:

D (The Declaration of Independence)

answered
User Keleigh
by
7.8k points
4 votes
The Declaration of Independence announced that the American colonies no longer wished to be part of the British Empire.
answered
User Gin
by
8.1k points

