After the Spanish-American War, the United States emerged as a __ __ following its victory over Spain.

1 Answer


Answer:

The United States emerged as a world power as a result of its victory over Spain in the Spanish-American War. The United States gained possession of the Philippines, Guam, and Puerto Rico.


answered by Ambrosia (7.2k points)

