What international role did the United States assume after the war?
History · middle-school
Asked Jan 25, 2020 by Smi (9.4k points) · 3 votes · 40.6k views
1 Answer
4 votes
Germany believed it could win the war before the United States could fully mobilize its military, and after Russia withdrew from the war, Germany no longer had to fight on the Eastern Front.
Hope it helps ;)
Answered Jan 31, 2020 by Gustavo Dias (9.1k points)