18.1.08

Utilitarians And Brain Damage

Going through some old blog carnivals, I came across a post by Wesley Buckwalter provocatively titled "Are Utilitarians Brain-Damaged?" The post describes an interesting experiment* in which both normal people and people with damage to the ventromedial prefrontal cortex (a brain region known to play an important role in emotion) were asked to evaluate various moral dilemmas. Both groups tended to make utilitarian judgments about moral dilemmas that were impersonal (involving strangers and actions separated from the decider by some chain of cause-and-effect). But when faced with personal moral dilemmas (which required direct actions on known individuals), only the brain-damaged people continued to make utilitarian judgments. (Note that in studies like these there is typically a good amount of variability between people -- so there were probably plenty of "normal" people who made utilitarian judgments in all cases.)

This study confirms some interesting things about how our brains work. But I'm not sure it says as much about how we should make moral judgments as Buckwalter suggests. He concludes his post:

If it is true that we have experimental proof that in certain circumstances intuitions and emotions are necessary to bring about normal moral judgments, what are the implications for popular consequentialist theories, which, it could be argued, sometimes rely on the absence of emotion in the decision making process? Further, if it is the case that a particular part of the pre-frontal cortex is responsible for the emotions that in some cases give rise to moral judgment, can Utilitarianism account for the fact that it may require something of its agents that is just contrary to their mental architecture?


That the demands of utilitarianism often run contrary to our intuitions about particular cases is nothing new. Bentham and Mill were, after all, social reformers. And indeed, what's the point of a moral philosophy that never tells us we ought to do something we didn't already think we should do (or more precisely, that sets out to make sure it never asks us to do such a thing)? In any event, since the study showed both kinds of people making utilitarian judgments in some cases, it poses an equal challenge to consistent anti-utilitarians. This study might have raised problems if it had shown that the normal participants couldn't reason, or act, in a consistently more utilitarian way than their intuitions led them to. But it showed no such thing, and in fact we have good reason to believe that people can alter or recalibrate their intuitive judgments.

Moral psychology is an important pursuit, but interpreting its results requires caution about the is/ought divide. After all, if someone did a study showing that normal people are susceptible to the Gambler's Fallacy but people with damage to a certain part of the brain are not, we would not take that as evidence that the Gambler's Fallacy is correct after all and statisticians are brain-damaged, nor that trying to teach people to be more statistically literate is a fool's errand.

I would note as well that it's questionable to take this study to demonstrate that people with damage to the ventromedial prefrontal cortex necessarily always reason in a utilitarian manner (or that normal people always reason the specific way the control group did). After all, if you are entirely devoid of emotion, why would you care about saving the greater number of people or causing others less harm? I think a more culturally sensitive interpretation of the results is that they confirm the dual-process theory of judgment (that we have a quick, intuitive, affect-based process and a slower, methodical, cognition-based process) and that the ventromedial prefrontal cortex is critical in applying the quick process. The specific content of those processes, however, may be (to a greater or lesser extent) culturally variable, so it's only in certain cultures -- such as the dominant culture of the modern West, where something like utilitarianism is deeply ingrained as the very definition of rationality -- that the slow process would produce specifically utilitarian results.

*I don't have access to the original paper at the moment, so all of my discussion is working from Buckwalter's summary.
