How might those attitudes have changed as the war went on for several years?

I'm in 8th grade.

1 Answer


Answer:

World War I made Americans more isolationist and pacifist. The war soured them on foreign affairs: they felt they had been pulled into a conflict that was not truly vital to US interests. As a result, they hoped to stay out of foreign entanglements, except when taking steps meant to prevent another war.

answered by Jin Kwon (8.4k points)

