asked · 137k views
0 votes
What does "American Imperialism" mean?

1 Answer

6 votes
American imperialism refers to the economic, military, and cultural influence exerted by the United States internationally.

Hope this helps you!

~~~~~~~~~~~~~~~~~~~~~~Sora Keyblader~~~~~~~~~~~~~~~~~~~~~

answered by BMBM (8.6k points)
