What event finally brought America into WWII?

1 Answer

The Japanese attack on Pearl Harbor on December 7, 1941, is what finally brought America into WWII; the United States declared war on Japan the following day. (The Great Depression followed the 1929 stock market crash and preceded the war, but it was not the event that drew America into the conflict.)
answered by Crazypeter (8.7k points)

