76.8k views
24 votes
True or False?

The United States and Japan were still enemies AFTER World War II?

asked by User Saad Ali (8.5k points)

1 Answer

9 votes

Answer: False. After World War II, the two nations forged a strong alliance.


answered by User EgyEast (8.3k points)
