More women are graduating from college and snagging more knowledge-based jobs.

2 Answers


Is that a question or an opinion?

answered by Modern Labs
Yes, women in the U.S. have earned more degrees for decades.
answered by VasFou