asked · 72.8k views · 5 votes
What does the term manifest destiny imply?

2 Answers

0 votes
The belief or doctrine, held chiefly in the middle and latter part of the 19th century, that it was the destiny of the U.S. to expand its territory over the whole of North America.
answered by Tommaso Barbugli (7.8k points)
6 votes
Manifest Destiny was the idea that the people of America had a mission from God to conquer the continent from East to West. In reality, it was a term used to rationalize the American people's hunger for land and their westward expansion. The term was also used to rationalize the 'westernization' of the indigenous peoples they encountered as they traveled.
answered by Ali Helmy (7.8k points)
