Manifest Destiny is the belief that…

1 Answer

Manifest Destiny was the idea that Americans were destined, by God, to govern the North American continent. This idea, with all the transformations of landscape, culture, and religious belief it implied, had deep roots in American culture.
answered by Yutao Huang
