Tuesday, April 18, 2006

I'm O.K., You're Biased - New York Times

"Research suggests that decision-makers don't realize just how easily and often their objectivity is compromised. The human brain knows many tricks that allow it to consider evidence, weigh facts and still reach precisely the conclusion it favors.

When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn't misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn't, we subtly tip the scales in our favor.

... researchers asked subjects to evaluate a student's intelligence by examining information about him one piece at a time. The information was quite damning, and subjects were told they could stop examining it as soon as they'd reached a firm conclusion. Results showed that when subjects liked the student they were evaluating, they turned over one card after another, searching for the one piece of information that might allow them to say something nice about him. But when they disliked the student, they turned over a few cards, shrugged and called it a day.

Much of what happens in the brain is not evident to the brain itself, and thus people are better at playing these sorts of tricks on themselves than at catching themselves in the act. People realize that humans deceive themselves, of course, but they don't seem to realize that they too are human.

A Princeton University research team asked people to estimate how susceptible they and "the average person" were to a long list of judgmental biases; the majority of people claimed to be less biased than the majority of people. A 2001 study of medical residents found that 84 percent thought that their colleagues were influenced by gifts from pharmaceutical companies, but only 16 percent thought that they were similarly influenced. Dozens of studies have shown that when people try to overcome their judgmental biases — for example, when they are given information and told not to let it influence their judgment — they simply can't comply, even when money is at stake.

... two psychologists, Dale Miller and Rebecca Ratner, asked people to predict how many others would agree to give blood for free or for $15, and people predicted that the monetary incentive would double the rate of blood donation. But when the researchers actually asked people to give blood, they found that people were just as willing to donate for nothing as they were for a $15 reward.

The same researchers measured people's attitudes toward smoking bans and asked them to guess the attitudes of others. They found that smokers vastly overestimated nonsmokers' support for the bans, and nonsmokers vastly overestimated smokers' opposition to them — in other words, neither group was quite as self-interested as the other group believed.

Behavioral economics bolsters psychology's case. When subjects play laboratory games that allow them to walk away with cash, self-interest dictates that they should get all the cash they can carry. But scores of experiments show that subjects are willing to forgo cash in order to play nice.

For instance, when subjects are given a sum of money and told that they can split it with an unseen stranger in any proportion they like, they typically give the stranger a third or more, even though they could just as easily have given him nothing. When subjects play the opposite role and are made the recipients of such splits, they typically refuse any split they consider grossly unfair, preferring to walk away with nothing rather than accept an unjust distribution.

In a recent study, the economists Ernst Fehr and Simon Gächter had subjects play a game in which members of a team could earn money when everyone pitched in. They found that subjects were willing to spend their money just to make sure freeloaders on the team didn't earn any. Studies such as these suggest that people act in their own interests, but that their interests include ideals of fairness, prudence and generosity.

In short, doctors, judges, consultants and vice presidents strive for truth more often than we realize, and miss that mark more often than they realize. Because the brain cannot see itself fooling itself, the only reliable method for avoiding bias is to avoid the situations that produce it.


When doctors refuse to accept gifts from those who supply drugs to their patients, when justices refuse to hear cases involving those with whom they share familial ties and when chief executives refuse to let their compensation be determined by those beholden to them, then everyone sleeps well.

Until then, behavioral scientists have plenty to study."
