Do We Give Doctors Too Much Respect?
This may be a radical train of thought, but I think we give doctors too much respect. Yes, of course they studied hard and have vast knowledge and experience in their field of expertise, but that's the thing... gender and sexuality usually isn't their field of expertise, and yet when we go for an appointment we're expected to treat them like authority figures who mustn't be talked back to.
I once had a doctor who said something about 'the queers', and I just sat there and took it, even though I know that anywhere else I'd have pushed back against whoever said that to me.
I'm not suggesting starting an argument or getting angry in the moment; a lot of the language used comes from plain ignorance. But I would like us to stop treating doctors like gods and perhaps start teaching them about the world while we're in their office. They can learn how our community lives while we learn about our medical issues. It's a win-win!