What country brought the United States into World War 2?

Germany
Soviet Union
France
Japan

1 Answer


Final answer:

Japan brought the United States into World War 2.


Step-by-step explanation:

The country that brought the United States into World War 2 was Japan. Japan's surprise attack on Pearl Harbor on December 7, 1941, prompted the United States to declare war on Japan the following day. Germany and Italy, Japan's Axis allies, then declared war on the United States on December 11, 1941, drawing the US into the European theater as well.


answered by Spiralx (8.3k points)

