True or False?

The United States and Japan were still enemies after World War II.

asked by User Saad Ali

1 Answer


Answer: False. After World War II, the two nations forged a strong alliance.

Step-by-step explanation:

Following Japan's surrender in 1945, the United States occupied and helped rebuild the country. The 1951 Treaty of San Francisco and the U.S.-Japan Security Treaty then formalized a lasting alliance between the former enemies.

answered by User EgyEast
