59.6k views
25 votes
What is your opinion about Hawaii being annexed by the United States? Explain.

asked by TheBook (8.0k points)

2 Answers

2 votes

Answer:

At the time of annexation, American imperial policy was to expand throughout the Pacific. As bad as imperialism is, Hawaii was a strategic territory during WW2 and would have easily fallen into Japanese hands if the US had not annexed it. Today Hawaii is one of the most economically prosperous and most visited states. So as bad as it sounds, the US was right to annex the Hawaiian Islands, and the annexation benefited both the country and the state in the long run.

answered by Dogsgod (7.3k points)
8 votes
I don't think Hawaii should be annexed by the United States because Hawaii is a tropical place and a getaway for US citizens. Many of our natural resources come from the United States and as a result
answered by Tao Zhyn (7.6k points)