asked · 181k views
0 votes
What did the US do with Japan after WW2?

1 Answer

4 votes

Answer: After the defeat of Japan in World War II, the United States led the Allies in the occupation and rehabilitation of the Japanese state.

Explanation: The occupation, led by General Douglas MacArthur as Supreme Commander for the Allied Powers, lasted from 1945 to 1952 and brought demilitarization, democratic reforms, and a new constitution. Hope this helps.

answered by Ryan Searle (8.2k points)
