asked 131k views
2 votes
How did the Vietnam war change American foreign policy?

1 Answer

1 vote
The United States emerged from World War II confident in its power, but Vietnam left it with a humiliating defeat, shockingly high casualties, an American public sharply divided, and leaders uncertain of what lay ahead in foreign policy.
answered by Petros Kyriakou (8.3k points)

