92.9k views
0 votes
How did the US change the government of Japan after World War II?

asked by User CPI (7.9k points)

2 Answers

1 vote

Answer:

After Japan surrendered in 1945, ending World War II, Allied forces led by the United States occupied the nation, bringing drastic changes. Japan was disarmed, its empire dissolved, its form of government changed to a democracy, and its economy and education system reorganized and rebuilt.

answered by User Justin Lange (8.1k points)
3 votes

The United States created a democratic government in Japan.

answered by User Yuan He (7.8k points)

