asked · 120k views · 9 votes
Please help, I'm failing!

Following the War of 1812, the United States gained the respect of European nations.

True
False

2 Answers

5 votes
Answer: True

Explanation: I've taken the test before!
answered by Andrew Cumming (8.5k points)
3 votes

Answer:

It's true. The War of 1812 helped the United States gain respect from the nations of Europe.

answered by David M Smith (7.9k points)

