asked · 11.3k views
1 vote
After World War II, how did Americans view the role of the United States?

2 Answers

1 vote
The entry of the United States into World War II caused vast changes in virtually every aspect of American life, and most Americans came to view their country's place in the postwar world with optimism.
answered by User MPaulo (7.0k points)
1 vote

Many Americans, however, wanted the U.S. to retreat from its global responsibilities.

answered by User Twinmind (8.1k points)
