38.0k views · 2 votes
How do you think World War I changed life in the United States?

asked by User Moiz (7.6k points)

1 Answer

2 votes

Answer:

German Americans: The government treated us like enemies, even though we only wanted neutrality. Our traditions were suppressed or ignored.

Jewish Americans: We formed numerous organizations to support American troops and raise money for war victims.

Asian Americans: If we served as soldiers, we were allowed to become naturalized citizens.

As a consequence, groups such as African Americans, Hispanic Americans, and American women became more assertive in the postwar period, seeking the full freedoms and civil rights guaranteed by the Declaration of Independence and the US Constitution.


answered by User Kamta (7.6k points)
