182k views · 1 vote
How were African Americans treated after WWI?

asked by User JSWilson (8.1k points)

1 Answer

2 votes
Black people emerged from the war bloodied and scarred. Nevertheless, the war marked a turning point in their struggles for freedom and equal rights that would continue throughout the 20th century and into the 21st.
answered by User Frbl (7.8k points)

