50.4k views
3 votes
After the United States declared war on Japan, what nation declared war on the United States?

asked by User Pozuelog (8.7k points)

2 Answers

2 votes

Final answer:

Germany and Italy declared war on the United States on December 11, 1941, shortly after the U.S. declaration of war on Japan, bringing the United States into conflict with all of the major Axis powers of World War II.

Step-by-step explanation:

After the United States declared war on Japan, following the attack on Pearl Harbor on December 7, 1941, it was Germany and Italy that declared war on the United States. Both declarations came on December 11, 1941. Germany and Italy were allied with Japan through the Tripartite Pact, and they cited that alliance in joining the war against the United States. With those declarations, the United States was officially engaged against all of the major Axis powers and would fight World War II on multiple fronts.

answered by User JStead (8.9k points)
1 vote

Answer: Germany and Italy

Explanation: Germany and Italy declared war on the United States, which brought the U.S. fully into World War II.

answered by User Lejiend (8.5k points)

