Which event finally brought the United States into World War II?

a. Japan's attack on Pearl Harbor
b. Germany's invasion of France
c. Britain's attack on Gibraltar
d. Italy's invasion of Greece

1 Answer

The correct answer is "a. Japan's attack on Pearl Harbor." The attack was a direct strike on a United States military installation and an indisputable act of war, and it finally brought the United States into World War II.
answered by Liamf (8.2k points)