0 votes · 75.1k views
What is one way that Japan changed following World War II?

asked by MikaelF (7.2k points)

1 Answer

5 votes

Japan was defeated in World War II, after which the United States led the Allies in the occupation and rehabilitation of the Japanese state. Between 1945 and 1952, the U.S. occupying forces enacted widespread military, political, economic, and social reforms.

answered by Genjix (7.5k points)
