Why did America become isolationist after WW1? HELP!!

1 Answer


Answer:

World War I pushed American public opinion and policy toward isolationism: the war's heavy costs left many Americans disillusioned with European entanglements and eager to focus on affairs at home.

Step-by-step explanation:

The United States lost over 100,000 soldiers and spent billions of dollars in a conflict that many Americans came to see as a European quarrel which had not made the world "safe for democracy" as promised.

That disillusionment carried over into the peace settlement. The Senate rejected the Treaty of Versailles, so the United States never joined the League of Nations, signaling a retreat from collective security commitments.

In the 1920 election, Warren G. Harding won on a promise of a "return to normalcy," and the following decade emphasized domestic prosperity, high protective tariffs, and strict immigration quotas over foreign involvement.

Together, war weariness, distrust of European alliances, and a renewed focus on domestic concerns made isolationism the dominant mood of U.S. foreign policy after the war.

