105k views
3 votes
What event brought the United States into World War II? A. An attack on

asked by User Asia (8.8k points)

2 Answers

2 votes

Answer:

The USA didn't want anything to do with World War II until Japan attacked Pearl Harbor on December 7, 1941, which brought the USA into the war.


answered by User Tkrishtop (9.4k points)
4 votes

Answer:

The attack on Pearl Harbor.

answered by User Iya (8.5k points)

