5 votes · 22.2k views
What did Americans realize after the War of 1812?

asked by Badiboy (8.4k points)

1 Answer

5 votes

Answer:

Americans did adopt a sense of nationalism, and unity certainly seemed to be present. But although the country became unified in some relatively trivial ways after the War of 1812, for the most part the United States actually became more divided.


answered by Gorn (7.9k points)
