How did World War I change the US?

1 Answer


Answer:

The war fueled the Great Migration, the mass movement of African Americans from the rural South to northern industrial cities, and Black veterans who returned home to find discrimination still in place pushed for civil rights. The war also brought conscription, mass propaganda, the national security state, and an expanded federal Bureau of Investigation, the forerunner of the FBI.


answered by User Towkir (8.1k points)

