All of these countries had official colonies in Africa at the end of World War II EXCEPT

1 Answer

All of these countries had official colonies in Africa at the end of World War II except for Germany. Germany lost its African colonies after World War I, when the Treaty of Versailles stripped them away and redistributed them as League of Nations mandates, so it held no official colonies in Africa by 1945.
answered by Mlovic (8.6k points)

