asked · 184k views
3 votes
In the 1800s, many Americans believed in manifest destiny. What does that term mean?

2 Answers

5 votes
Manifest destiny was the belief among Americans that it was God's will for them to expand westward and conquer the land to the west.
answered by User Erykah (8.0k points)
3 votes
It means Americans believed it was obvious that God wanted the USA to expand and take over the whole of the North American continent.
answered by User Bulent (9.1k points)
