2 votes
Discuss how you think life in the South was different after the war.

1 Answer

0 votes

When the Civil War ended, the Union was restored and slavery came to an end. The war also ended the dominance of the plantation class that had ruled Southern society. Many farms and buildings were destroyed when the North invaded the South. Without enslaved labor, planters had no one to work their fields or take care of their homes. Many Southerners were now left in poverty, and it would take time for them to recover what was lost in the war.

answered by Vidhi (8.2k points)
