According to the Declaration of Independence, who has taken away the rights of Americans?

asked by Danziger (7.7k points)

1 Answer

According to the Declaration of Independence, it was the King of Great Britain (George III) who took away the rights of Americans; the document lists his "repeated injuries and usurpations" against the Colonies, which did not gain their freedom until the Revolutionary War.
answered by Kabilesh (7.4k points)
