So you go to see your doctor because you have been sick for a week and you want some antibiotics so you can finally get better. Or you have a cough and are concerned it's more than just a cough, and you think the doctor should give you a chest x-ray to make sure it's nothing serious. You get the point - you go to the doctor with an expectation of the care you should receive.
But the doctor doesn't agree and says what you want is unneeded. Your cold is a cold and will get better; antibiotics will do nothing. Your cough is a cough and you don't need an x-ray. Should the doctor go along with you and give you the antibiotics or the x-ray, or should they just tell you the truth? I want the truth.
I don't want a doctor to sugarcoat anything (especially when they say 'you may feel a pinch' - I want them to say 'this will hurt a lot, so grit your teeth'). I want them to tell me the truth. I don't like doctors who try to paint a fluffy, pastel-colored picture of my health. Give me the details and the numbers and I'll suck it up and cope with it.
Maybe other people are different. They don't want the truth in one big pile of information; they want little bits over time, or they don't want the details at all - they just want a cure for whatever they have.
I like my breast surgeon; he is good at telling it like he sees it. He has spoken to me in a very open style that gives me the truth. Some people don't like him because he puts things very plainly, but I appreciate it.
I have also learned that even though I make assumptions about what I want for my care, the doctors are the ones with the training to decide what is really needed. I let them make the recommendations and tell me why I need what they suggest. But then I make sure I agree with them. It's my body, after all.