What first drew Americans out to the West?
asked · 112k views · 2 votes

2 Answers

0 votes
The belief that settlers were destined to expand to the West is often referred to as Manifest Destiny.
answered by Fractalism (8.0k points)
3 votes
They thought God was leading them to the West.
answered by Andy Donegan (8.3k points)
