asked · 196k views
5 votes
How did the United States impact Japan after WW2 ended?

1 Answer

3 votes
After the defeat of Japan in World War II, the United States led the Allies in the occupation and rehabilitation of the Japanese state.
answered by Nikola (7.5k points)

