> For example, surgeons are more likely to recommend surgery than non-surgeons. Radiation-oncologists recommend radiation more than other physicians. This is known as specialty bias. Perhaps in an attempt to be transparent, some doctors spontaneously disclose their specialty bias. That is, surgeons may inform their patients that as surgeons, they are biased toward recommending surgery.
> My latest research, published last month in the Proceedings of the National Academy of Sciences, reveals that patients with localized prostate cancer (a condition that has multiple effective treatment options) who heard their surgeon disclose his or her specialty bias were nearly three times more likely to have surgery than those patients who did not hear their surgeon reveal such a bias. Rather than discounting the surgeon’s recommendation, patients reported increased trust in physicians who disclosed their specialty bias.
Every time I think I've figured out the depth of human irrationality, it turns out there's more.
It's actually a pretty good selling tactic: tell them why you think they need surgery, but add "I might not be the best person to tell you this, because as a surgeon I look at all problems through surgery." This makes you come off as honest, as putting the patient before your own benefit, and because patients don't expect such disclosure, they end up trusting you a lot more.
Dunno, this seems perfectly sensible to me. If a patient doesn't have much other information or expertise to go on, the decision they're making is as much "do I trust this doctor?" as it is "should I have surgery?". And setting aside the latter question, I don't think it's irrational for hearing a disclosure to influence the former.
Also, a doctor who is aware of their biases may be more competent than one who isn't. Conspicuously absent from the first post is any comment on the relative competence of the doctors.
This instantly reminded me of an article I read about medication side effect disclaimers actually increasing the odds of people trying the product. Both are pretty interesting effects.
> We found no evidence that consumers benefit from government-mandated disclaimers in advertising. Experiments and common experience show that admonishments to change or avoid behaviors often have effects opposite to those intended. We found 18 experimental studies that provided evidence relevant to mandatory disclaimers. Mandated messages increased confusion in all, and were ineffective or harmful in the 15 studies that examined perceptions, attitudes, or decisions. We conducted an experiment on the effects of a government-mandated disclaimer for a Florida court case. Two advertisements for dentists offering implant dentistry were shown to 317 subjects. One advertiser had implant dentistry credentials. Subjects exposed to the disclaimer more often recommended the advertiser who lacked credentials. Women and less-educated subjects were particularly prone to this error. In addition, subjects drew false and damaging inferences about the credentialed dentist.
I wonder if this extends to sharing your reasoning for a decision with others. It seems like it might be a sort of social cognitive offload, where we feel more sure of our decisions when we expose our reasoning to other, even unqualified, people. I think hearing no rebuttal gives us assurance, much like someone positively encouraging you to do something does, even if the encourager has no idea of the difficulty of the task, your qualifications, etc.
This seems especially relevant for bias disclosure, because now we feel better giving in to those biases because we've issued warnings.
Anecdotally, I see this used in social situations where someone is not particularly nice, who then unapologetically informs everyone that they are "not nice" or "mean" as if it excuses the behavior.
Biases are blaming statements. Saying someone is "not nice" is a blaming statement aimed directly at a person, made to avoid the disruption caused by the words that elicited the statement in the first place. Blaming someone else's character to diminish the importance of the statement they made is a well-known and oft-wielded bias.
Biases work because they allow a steady state of cognitive dissonance to occur. It's only when dissonance is disturbed that it begins to chew up resources - both internally and externally. How much effort should be going into picking a good president here in the US and what is limiting our ability to change things for the better? That's a HARD question to answer and requires a lot of work from a lot of different people to get even reasonably close to breaking the dissonance in society for the better.
It seems a far more efficient solution to let things randomly break at some point, giving a nice, if messy, reset of societal cognition.
In business, the standard for conflict of interest is "did you disclose?" So LendingClub's CEO got in trouble for not disclosing the conflict of interest. Achieving financial benefit on the side wasn't the issue; the lack of disclosure was.
This actually creates worse governance issues, because you can't trust that execs will do the right thing. You can only trust that they'll tell the board, who may have conflicts of their own.
It seems like the paradox very much involves the problem of a person disclosing their own bias. This results in all sorts of paradoxical problems: the person wanting to maintain social relations, the listener taking the disclosure as part of social signaling in general, etc.
A relatively simple way around this, I'd suggest, is having someone else make the bias disclosure, and indeed requiring that someone else make it.