asked · 113k views · 5 votes
What does the word "continental" mean when describing the U.S. states?

2 Answers

4 votes

It refers to the states located on the North American continent, i.e., every state except Hawaii. Note that Alaska counts as continental; the narrower term "contiguous U.S." excludes Alaska as well.

answered by Tim Dams · 7.5k points
3 votes

The term "continental U.S." refers to any state situated on the North American continent.

