asked · 112k views · 2 votes
HELP!

What does the term Manifest Destiny mean, and how did it influence the westward expansion of America?

1 Answer

3 votes

Answer:

Manifest Destiny was a popular belief in the mid-to-late 19th century. Its proponents claimed that the United States had a divine right to expand westward, meaning that expansion across the continent was the will of God. This belief helped justify and drive westward expansion: it was invoked to support the annexation of Texas, the settlement of the Oregon Territory, and the Mexican-American War, and it encouraged settlers to move west, often at the expense of Native American nations.

answered by Ernst Zwingli (7.2k points)