74.2k views
5 votes
What happened in the world after World War II?

A. People started preparing for World War III.
B. Many countries gained new found independence.
C. Japan and France became the world's two superpowers.
D. World leaders signed an agreement to never go to war again.

asked by User Pbm (7.2k points)

2 Answers

3 votes
It should be D: world leaders signed an agreement to never go to war again.
answered by User Aprock (7.8k points)
6 votes
D. World leaders signed an agreement to never go to war again.
answered by User Zaheen (8.1k points)