asked · 165k views
3 votes
Nursing is a Woman's Job
Please discuss


2 Answers

3 votes

Answer:

Yes, nursing is a woman's job.

answered by Dalvinder Singh (8.1k points)
4 votes
Nursing is a woman’s job because women have motherly instincts and are kind and caring. Most nurses are female, though some are men. They also care for those...

Hope this helped ♥︎
answered by Sergey Rybalkin (7.9k points)

