3 votes
Women won the right to vote after America's victory in which war?

A. World War II

B. World War I

C. The Civil War

D. The Spanish-American War

2 Answers

3 votes

Answer:

B. World War I

Step-by-step explanation:

Women won the right to vote with the 19th Amendment, which Congress passed in 1919 and the states ratified in 1920. World War I ended in November 1918, so the answer is B.

answered
User Bharat Chhabra
by
8.0k points
2 votes

Answer:

B. WWI

Step-by-step explanation:

They won the right to vote with the ratification of the 19th Amendment in 1920, shortly after World War I ended in 1918.

answered
User Sambecker
by
8.0k points

