0 votes · 12.0k views
Which country has influenced foreign policy in the United States the most since the end of World War II?

Japan
Russia
France
Britain

1 Answer

4 votes
Japan
answered by Wojo (7.3k points)

