asked · 132k views
3 votes
What does the term “Southern Redemption” mean?

A. Southerners regained control of their states after Reconstruction.
B. Southerners were forced to pay the costs of the Civil War.
C. The South was freed from the institution of slavery.
D. The South became a fairer, more equal society.

asked by User Raymund (8.0k points)

2 Answers

0 votes

Answer: A

answered by User Musthero (7.6k points)
5 votes

Answer: A: Southerners regained control of their states after Reconstruction.

Step-by-step explanation: "Redemption" was the term white Southern Democrats (the self-styled "Redeemers") used for regaining control of their state governments from Republican Reconstruction governments during the 1870s, a process completed when federal troops were withdrawn after the Compromise of 1877.

answered by User Jared S (8.1k points)