Is gender-affirming care the same thing as gender-affirming therapy? I’m sick of both phrases, but gender-affirming therapy obviously involves a therapist. I thought a therapist’s job was to get at the true cause of whatever is keeping the patient from moving forward in life, not to give patients a pat on the head and affirm whatever self-diagnosis they present with.
And I’ve always taken gender-affirming care to mean that a doctor is involved. When I visit a doctor, I expect them to gather the evidence and provide a diagnosis. I don’t expect to have my colon removed because I “feel” like I have colon cancer.
What does affirmation have to do with the job of a doctor?