1 vote
What did Americans believe Manifest Destiny was?

1.) They believed they should move west and own land
2.) They wanted to own the land north of them
3.) They wanted to own the land south of them

2 Answers

1 vote

Answer:

1

Step-by-step explanation:

Manifest Destiny was the idea that the United States was destined, by God in the eyes of its advocates, to expand its dominion and spread democracy and capitalism across the entire North American continent.

answered by User Monkrus (8.0k points)
0 votes

Answer:

1

Step-by-step explanation:

They believed they should move west.

answered by User Melique (8.5k points)

