1 vote
146k views

How did the United States end up at war with Germany in 1941?

asked by User Xpioneer (8.5k points)

2 Answers

2 votes

Answer:

A.

After the U.S. declared war on Japan, Germany declared war on the U.S.

Step-by-step explanation:

After Japan attacked Pearl Harbor on December 7, 1941, the United States declared war on Japan the next day. Germany, Japan's Axis partner, then declared war on the United States on December 11, 1941.

answered by User Omkar Rajam (8.6k points)
2 votes
Germany declared war on the US on December 11, 1941, four days after Japan attacked Pearl Harbor, as a gesture of solidarity with its Axis ally.
answered by User Steve Peschka (7.7k points)

