asked · 179k views
1 vote
What is the meaning of this line? "Before the war, Americans tended to say 'the United States are,' but after the war, they said 'the United States is.'"

2 Answers

3 votes

Answer:

It means that once they had fought a war together, they were true Americans fighting for their country.

answered by Siddharth (7.8k points)
2 votes

Answer:

They underestimated the United States before they had seen its potential.

answered by VAndrei (7.9k points)

