Why did the United States finally enter World War II?

Germany invaded Poland.

The Soviet Union refused to pay back debts it owed.

Italy took territory in Ethiopia.

Japan bombed Pearl Harbor.

1 Answer

The correct answer is: Japan bombed Pearl Harbor. The attack on Pearl Harbor is what finally brought the United States into the war. Although the war had begun with Germany's invasion of Poland in 1939, the United States initially stayed out of it. Everything changed on December 7, 1941, a date no American would forget: Japan attacked the U.S. naval base at Pearl Harbor, and the United States declared war on Japan the next day.
answered by Ferhatelmas (8.1k points)