Doctor:: "So you want to go into a health profession?" Me:: "I think so." Doctor:: "Ohh, so you want to be a nurse. Okay." Made me feel annoyed, shrunken, marginalized.
Does that doctor think that women can only be nurses? That is hurtful.
It was an example of what people say and think about women. I hate it when men say things like that.
It sounds unfair that someone would say that. I believe the only thing that matters is whether you can do the job right, not your gender. If a woman can do the job and wants to do it, she should be able to.