60.2k views
5 votes
What did African Americans gain in WW2?

asked by Juffel (8.4k points)

2 Answers

7 votes
African Americans gained land, but in unfarmable territory.
answered by Tsragravorogh (8.3k points)
2 votes
During World War II, some African American leaders were wary, but they still got involved in the war effort. This involvement gained them ground in the civil rights movement. They were also admitted to the Navy and the Air Force for the first time.
answered by Fhchl (8.4k points)

