19.3k views
1 vote
In which part of the United States did women first begin to get the right to vote?

A: Eastern Coastal Towns

B: Northern Industrial Cities

C: The South

D: The West

asked by User Bucky (8.2k points)

1 Answer

6 votes
D: The West. Wyoming Territory was the first to grant women full voting rights, in 1869, and other western territories and states such as Utah, Colorado, and Idaho followed well before the rest of the country.
answered by User Hubert (8.7k points)

