

Biased Reception Of Corrections

There has been some discussion recently of studies showing the human resistance to contradictory information. Generally, people confronted with information that contradicts their beliefs search for ways to discount it, since we have so much invested in our prior beliefs and little invested in the new information -- a process that may end up *confirming* our preexisting beliefs. This is not necessarily as irrational as it sounds -- as readers of reports on the experiments, we know that the contradictory information is true and reliable, but in real life we don't (nor do we as subjects in a psychology experiment, given what we know about how psychology experiments work!). We'd be in trouble if we discarded our previous beliefs every time a piece of new information came along (especially since it's hard to keep an accurate catalog of the bits of information that went into creating those beliefs, rather than just remembering the conclusions they brought us to). Nevertheless, this tendency can certainly be pathological much of the time. At the very least, it's problematic that people systematically underestimate how biased they are, even while making vocal declarations of humility.

What aroused particular interest in one recent study (pdf) was the claim that conservatives are more susceptible to this bias than liberals. But I think there's room for caution about this result, based on the design of the instrument. If we're testing contradictions of people's political beliefs, liberals and conservatives have to be exposed to different contradictions. These contradictions have to be chosen by someone -- the researcher. But researchers, being people, are presumably subject to the same biases as their subjects. There's no objective way to rate the contradictoriness of statements, nor is there a comprehensive database of mendacity that can be randomly sampled. That makes it difficult to trust that the researcher could select equally clear, and equally deep-cutting, pieces of contradictory information for both liberals and conservatives. The problem is greater when fewer contradictions are used -- e.g. the study linked above examined only three: rebuttals to "Iraq had WMDs," "tax cuts increase revenues," and "Bush banned stem cell research." A liberal researcher might thus tend to select minor hypocrisies by liberal politicians and major ones by conservatives as a way of protecting their own beliefs. Or they might overcompensate for their own bias by picking worse liberal examples. Or they may simply misjudge what will come off as a serious versus a minor contradiction within the conservative worldview, since they don't share its premises. And all of this applies mutatis mutandis to conservative researchers. Meanwhile, readers -- being human beings with biases too -- would have difficulty verifying the comparability of the contradictions. (Note that the study itself is far more cautious about the liberals-vs-conservatives claim than the news report linked in the first paragraph.)

All of these concerns apply only to the question of whether conservatives or liberals are more subject to confirmation bias. That the studies demonstrate the existence of this bias in both camps, can elucidate its mechanisms, and can relate it to other variables (gender, age, etc.) or psychological processes is not in question. But to have confidence in a finding of liberal-conservative difference, we would need a large difference shown in multiple high-quality studies, covering a variety of different contradictions, conducted by researchers of different political persuasions.

Then again, maybe I'm just biased against declarations of conservatives' inherent stupidity.
