1 vote
How did the United States end up at war with Germany in 1941?

A. The United States declared war on Germany after the London blitz.
B. After America came to Britain's aid with the "lend-lease" program, Germany declared war on the U.S.
C. After the U.S. declared war on Japan, Germany declared war on the U.S.
D. The U.S. declared war on Germany and Italy at the same time it declared war on Japan.

asked by User Nemus (7.9k points)

2 Answers

4 votes

Answer: C

Step-by-step explanation:

I just did the quiz and got it right.

answered by User Mario Fusco (8.6k points)
2 votes

Answer:

C. After the U.S. declared war on Japan, Germany declared war on the U.S.

Explanation:

After the Japanese attack on Pearl Harbor and the United States' declaration of war against the Japanese Empire, Nazi Germany declared war on the United States, citing what it claimed was a series of provocations by the United States government.

answered by User PaulDong (8.3k points)