asked · 42.3k views
10 votes
WWI showcased the dark side of what?

1 Answer

9 votes

Answer:

World War I was one of the deadliest conflicts in human history up to that point, claiming tens of millions of casualties on all sides.

The experience of World War I had a major impact on US domestic politics, culture, and society. Women achieved the right to vote, while other groups of American citizens, such as German Americans and antiwar dissenters, were subjected to systematic repression.

answered by User Dlu (7.7k points)