3 votes
How did Americans feel about World War I?

1 Answer

7 votes

Answer:

Most Americans initially wanted to stay out of European politics and remain neutral. That changed when the Zimmermann Telegram revealed Germany's proposal that Mexico attack the United States, and public opinion turned sharply against Germany. Many came to see World War I as a futile war, since the issues it left unresolved contributed to World War II.

answered by Wangdq (8.3k points)
