asked · 45.5k views
2 votes
What does the term "American Frontier" refer to?

1 Answer

4 votes
The American Frontier refers to the culture, geography, history, and folklore of the westward expansion of American colonists, beginning in the 17th century.
answered by BigL (8.2k points)

