Manifest Destiny was the belief that the United States should ______.

1 Answer


Answer: Manifest Destiny, a phrase coined in 1845, is the idea that the United States was destined, by God in the view of its advocates, to expand its dominion and spread democracy and capitalism across the entire North American continent.

answered by User Shanyu
