asked · 217k views · 0 votes
Explain how Hawaii became a U.S. territory.

1 Answer

8 votes

Answer:

Hawaii was annexed by the United States in 1898, during the Spanish-American War, and was formally organized as a U.S. territory in 1900.

Step-by-step explanation:

Driven by the spirit of Manifest Destiny, the United States built an informal empire in the Pacific around the time of the war. The Philippines became its largest colony, and Hawaii, lying much closer to home, was annexed as well. Hawaii held strategic geographic value and would eventually become an economically prosperous state.

answered by Aarni Joensuu (8.6k points)
