172k views · 3 votes
1. When the United States first became a nation, what did the West mean?

asked by User Hoang (8.2k points)

1 Answer

1 vote

Answer: The West was an area that people described as a place of peace and prosperity.

Explanation: At first the West was largely unpopulated, and there was no established government there.

answered by User Wingblade (8.4k points)

