Hypothesis-confirming biases

I hired movers, and they sent a guy over to give me an estimate. He looked over my apartment, which is stuffed to the gills. When the actual movers came, there were 20 more boxes than estimated, costing $800 more than quoted (I hired them to pack as well as move the stuff). I called and complained, and the guy who gave the estimate implicitly accused me of having added much more stuff to the apartment: "I'm never 20 boxes off," he said.

Well, he was. My apartment was full of papers and books, as well as a pile of boxes full of... wait for it... papers and books. Someday I need to learn to dispose of a piece of paper. The movers said they kept finding stuff behind stuff, and then more stuff behind that. The whole thing weighed about 5,000 pounds. The movers ran out of tape: they'd brought ten rolls, and the last three boxes went untaped on top. I know one person who will not be the least bit impressed with these numbers-- Anthony, how many pounds of stuff did you end up having to get moved?

This estimate guy is engaging in hypothesis-confirming behavior, I think. He gave an estimate, and when it turned out to be off, he would rather believe I added stuff to the apartment than believe he was far off. Given the information he has, both are possible, but you see what this does-- since he interprets this episode as one in which he was not off, the next time it happens he will again be likely to disbelieve it, because, as he said this time, "I'm never 20 boxes off." It should be very hard to convince this fellow that his estimates are ever off.

It's a general behavioral tendency in human beings: to look for evidence that supports your hypothesis rather than to search for evidence that disconfirms it. It allows normally smart people to believe weird things, such as the moon effect. The moon effect theory holds that people behave differently (usually more actively or criminally) when there is a full moon. Scientists have studied this and found it to be bunk.

This paragraph debunks the common arguments behind the moon effect. Skip it if you want. When I hear people argue for the moon effect, they say things like "The moon affects the tides, and we are 98% water, so it follows that the moon should affect us too." There are several things wrong with this viewpoint. First, whether the moon is full or not has nothing to do with how close it is to the earth. Remember that the moon can be "up" even in the daytime; we just don't see it. Second, the fact that we are made of water has nothing to do with anything. The moon affects the tides through gravitation, and gravity pulls on all matter in proportion to its mass. That is, it pulls on water no more and no less than on any other matter. Third, even if there were a gravitational effect, the tidal force-- the difference in pull across an object, which is what actually raises the tides-- falls off with the cube of distance, so a skyscraper you're standing next to exerts a greater tidal effect on your body than the moon does. To be consistent, these people should also believe in the skyscraper effect. Finally, there's no evidence, nor any plausible reason to think, that a gravitational pull on a person should affect how they behave.
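The skyscraper comparison can be checked with a quick back-of-the-envelope calculation. Here is a minimal sketch in Python; the moon's mass and distance are real figures, while the skyscraper's mass and your distance from it are my own rough assumptions. The moon's raw pull on you actually turns out to be larger, but the differential (tidal) pull-- the quantity relevant to tides-- is dominated by the nearby building.

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def direct_pull(mass_kg, distance_m):
    """Gravitational acceleration toward an object: G*M / r^2."""
    return G * mass_kg / distance_m**2

def tidal_pull(mass_kg, distance_m, body_size_m=1.0):
    """Approximate difference in pull across a body of size d: 2*G*M*d / r^3."""
    return 2 * G * mass_kg * body_size_m / distance_m**3

moon = dict(mass_kg=7.35e22, distance_m=3.84e8)   # real values
tower = dict(mass_kg=3e8, distance_m=50.0)        # assumed, illustrative values

print(f"direct pull, moon:  {direct_pull(**moon):.2e} m/s^2")
print(f"direct pull, tower: {direct_pull(**tower):.2e} m/s^2")
print(f"tidal pull, moon:   {tidal_pull(**moon):.2e} m/s^2")
print(f"tidal pull, tower:  {tidal_pull(**tower):.2e} m/s^2")
```

On these assumed figures the moon wins on raw pull but the tower's tidal effect is several orders of magnitude larger-- and either way, all of these accelerations are absurdly tiny compared to the earth's 9.8 m/s^2.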

Nonetheless, people continue to believe. I've heard from a psychologist who studied the moon effect that if you ask police officers and hospital nurses they will swear it's real. How can this be?

Here's how it works: People hear about the moon effect. Once the idea is in their minds, they notice whenever something strange happens AND there is a full moon. They never notice when people act normally during a full moon, or when people act strangely when there's not a full moon. What's to notice? Only the combination of the two triggers the memory of the moon effect at all. So as you go through life, you believe in the moon effect more and more. The negative evidence just does not get remembered.
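This lopsided bookkeeping can be demonstrated with a toy simulation (the base rates below are made up for illustration; nothing here comes from a real study). Even when strange behavior is generated completely independently of the moon, an observer who records only the full-moon-AND-strange coincidences ends up with a memory that is 100% confirming evidence.

```python
import random

random.seed(0)

# Made-up, independent base rates: a full moon ~1 night in 30,
# "strange behavior" ~1 night in 10.
N = 100_000
nights = [(random.random() < 1/30, random.random() < 1/10) for _ in range(N)]

full_nights = [strange for full, strange in nights if full]
other_nights = [strange for full, strange in nights if not full]
p_strange_given_full = sum(full_nights) / len(full_nights)
p_strange_given_other = sum(other_nights) / len(other_nights)

# A biased observer only notices and records the coincidences.
remembered = [(f, s) for f, s in nights if f and s]

print(f"P(strange | full moon):    {p_strange_given_full:.3f}")
print(f"P(strange | no full moon): {p_strange_given_other:.3f}")
print(f"nights remembered: {len(remembered)}, all of them confirming")
```

The two conditional probabilities come out essentially equal-- there is no moon effect in the simulated data-- yet every single remembered night supports the belief.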

I will leave the analogy with religious beliefs as an exercise for the reader.

One of the things that makes science so great is that it works against some of our inherent psychological biases. It requires careful observation and recording, which mitigates the hypothesis-confirming bias. Scientists often support a theory by actively seeking, and failing to find, disconfirming evidence. Awesome!


Dustin said…
I wrote a Semiotics thing on this actually.

It seems that the combination of two statistically unlikely things is noticed more by the human brain, which may be the reason for unfounded prejudice.

Take two rare things, say

- Crime
- A racial minority.

The human brain remembers the occurrence of crime combined with a racial minority better than any other combination (e.g. a majority and crime, or a minority and a virtue).
This was tested in a study using two different sets of words (I can't remember which - one was rarer) and two different colours (blue and red; red was rarer). These were placed on cards, the words coloured either red or blue, and the subjects perused them.

The distribution of words between colours was even (the same proportion of each word set occurred in both colours); however, since one word set was rarer and red was rarer, the subjects strongly associated the two.

So this points to a psychological cause of unfounded discrimination.
Anthony said…
I'll go check the numbers. But the estimate was 12,000 pounds of books, plus an insane amount of other stuff. And then they were overweight. By a *lot*.

-the aforementioned Anthony
