
A craps player must roll an 11 to win his bet. Hoping to do so, he shakes his clasped hands vigorously then fires the two dice at the felt table. Instead, he rolls a four.

On his next turn, the magic number is lower--three. He delicately cradles then gingerly tosses the dice. He gets a seven and loses.

The gambler believes he can control the dice by how he throws them--a hard roll for a big number and a soft roll for a low number--even though the result is pure chance, explains Tom Gilovich, PhD, a psychology professor at Cornell University, who uses the example to show how people irrationally intuit events. "We think a big cause should create a big effect," he says. "People think, 'If I roll the die hard I'm more likely to get a big number.'"
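The chance Gilovich describes is easy to pin down by counting: of the 36 equally likely outcomes of two fair dice, exactly two sum to 11 and exactly two sum to 3, regardless of how the dice are thrown. A quick illustrative sketch (not from the article; the `p_sum` helper is ours):

```python
from itertools import product
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of two fair six-sided dice.
outcomes = list(product(range(1, 7), repeat=2))

def p_sum(target):
    """Probability that the two dice sum to `target`."""
    hits = sum(1 for a, b in outcomes if a + b == target)
    return Fraction(hits, len(outcomes))

print(p_sum(11))  # 1/18 -- only (5,6) and (6,5) qualify
print(p_sum(3))   # 1/18 -- only (1,2) and (2,1) qualify
```

Both the "big" 11 and the "small" 3 come up with the same 1-in-18 probability; no throwing style changes that.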

Such logic may seem silly from the outside. Yet we've all felt sure of certain judgments, even if we later admit those decisions lacked an iota of logic. We go with our gut, yet our intuition often fails us, psychologists say. So why do we still rely on it?

Research suggests that intuition, despite its flaws, is integral to our thinking. Some say it helps us make connections between events and so make sense of a chaotic world; others argue it's necessary because we need some immediate read on events.

Yet sometimes we are too eager to find connections, so we create them when none exist. And while indulging that bias at a craps game is relatively harmless, such thinking causes larger problems--from misjudging probabilities to choosing the wrong person for a job, psychologists say. What's worse, it's a tough bias to overcome.

"I've argued that thinking about alternatives helps," says Jonathan Baron, PhD, a psychology professor at the University of Pennsylvania. "But it doesn't seem to help everything."

Desperately seeking something

A basketball player hits five straight shots. "He's on fire!" the announcer exclaims. Yet his next five shots miss. What happened to his hot hand? Nothing--it never existed, says Gilovich, author of "How We Know What Isn't So" (Simon and Schuster, 1993).

People's intuition derives from a desire to find patterns and connections in--and to figure out how to act within--an otherwise random universe. So when a player sinks five straight shots, we think those shots are interconnected and label the player "on fire," Gilovich says, even though those shots are largely independent of one another.

Sometimes people overestimate their intuition, and that overconfidence undermines sound decision-making. For example, employers believe they can predict employees' future performance better through interviews than through factors like education and test scores. However, research by Robyn Dawes, PhD, a professor in Carnegie Mellon University's department of social and decision sciences, using standard statistical regression analyses, finds the scores are actually the better predictors. Yet people still want to interview candidates because the illusion is so powerful, suggests Richard Nisbett, PhD, of the University of Michigan.

"No unstructured interview for any kind of position--graduate school, medical school, the military or professional jobs--has anything but a low validity for predicting the interviewee's future performance," he says. "But we can't contain it. You meet someone and the feeling that you know who that person is after talking to them is just overwhelming."

And sometimes intuition defies logic in our decision-making. Seymour Epstein, PhD, an emeritus psychology professor at the University of Massachusetts Amherst, and his colleagues demonstrated that people often ignore probability when teased by their gut. Participants tried to draw a red jelly bean from one of two jars, most beans in each being white: a 100-bean jar containing seven red beans and a 10-bean jar containing one. Even though the odds of drawing a red bean were better for the 10-bean jar (10 percent) than for the 100-bean jar (7 percent), more participants chose the 100-bean jar. Why? Because it simply held more red beans, and their gut said more red beans meant a better shot at winning.
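The arithmetic the participants' gut overrode is a one-line comparison. A minimal check, using the jar sizes and red-bean counts as described above:

```python
from fractions import Fraction

# Odds of drawing a red bean, as described in the study.
small_jar = Fraction(1, 10)    # 10-bean jar: 1 red bean   -> 10 percent
large_jar = Fraction(7, 100)   # 100-bean jar: 7 red beans -> 7 percent

# The 10-bean jar is the rational pick, despite holding fewer red beans.
print(small_jar > large_jar)   # True
```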

The study, in the 1994 Journal of Personality and Social Psychology (Vol. 66, No. 5), found that conflicted participants, though fully aware their decision was irrational, trusted their intuition--more red beans, better chances--over their understanding of probability. The finding suggests a dual-mind process, the researchers argue: Logic told participants to pick from the 10-bean jar, while intuition directed them to the 100-bean jar--and in this case, intuition was wrong.

"If people are being pulled in two directions," Gilovich says, "it suggests there are two minds pulling--one deliberate and one intuitive."

Mechanics of the mind

Gilovich calls these two minds "system one" and "system two." System one is intuitive, rooted in personal experience, imbued with emotion and interested in here-and-now situations. System two draws on logic and deduction, plods slowly through each hypothetical situation, analyzes the odds and selects the most rational decision.

System one's advantage is its quick, survival-based decisions, such as assessing the potential threat from a stranger on a dark street, Gilovich says. Because intuition operates on a gut level, its judgment is compelling. People develop heuristics--mental shortcuts for viewing the world--using this system.

And that's where we can get into trouble. "Intuition leads us astray because it's not very good at picking up flaws in the evidence," Gilovich says. "It will be faulty when the world conspires against us and presents information that is unrepresentative and misleading."

He notes, for example, that our intuition gives short shrift to events that are less dramatic or newsworthy, and to events unrelated to one another. In addition, we disproportionately notice coincidences but fail to notice when no coincidence occurs.

This bias presents larger problems than picking jelly beans. Consider moral behavior, says Baron. He mentions the "do no harm" heuristic: People generally believe that harming someone through action is worse than harming them through inaction, even if the result is the same.

In a 1991 Journal of Experimental Social Psychology (Vol. 27, No. 1) study, Baron told participants about a tennis player who, during dinner the night before a big match, tried to sicken his opponent by knowingly recommending to him a salad dressing that contained an ingredient his opponent was allergic to. In another version, the player merely notices his opponent ordering the dressing but says nothing to warn him. Most of the participants thought the player's behavior was worse in the first example.

The heuristic plays out in real-life issues, says Baron, citing euthanasia. Most people in the United States, he says, feel it's better to let somebody die naturally, even if the death is long and painful. In most states it's illegal to end a person's life to ease the pain, even when that person wants to die. Without justification, people intuitively favor inaction.

Lost in translation

Our intuition is doubly biased. Just as we think we can intuit others, we think other people notice--and judge--our appearance, behavior and emotions more than they actually do, says Kenneth Savitsky, PhD, a psychology professor at Williams College. Savitsky calls this our "illusion of transparency."

This bias creates communication problems in relationships. In a study of married couples, people assumed their spouses could pick up on hints and detect meaning in subtle voice inflections--such as sarcasm--better than a stranger could. But the spouses could interpret those meanings no better than strangers could--suggesting that we think our hints, jokes, sarcasm and things left unsaid are clearer to others than they really are.

The bias also makes us nervous in public. In a 2001 Journal of Personality and Social Psychology (Vol. 81, No. 1) study, Savitsky and colleagues asked participants to wear embarrassing T-shirts, then walk into a room of their peers. Participants predicted that half the room would notice the shirts, but fewer than a quarter did.

"We overestimate how negatively others will judge us because our transgressions loom larger to us than they do to others," Savitsky says. "We think we're the tell-tale heart. But observers have other things to think about besides our social blunders. On some level, we know others aren't as focused on us as we are on ourselves, but it can be easy to lose sight of that idea--especially when we are in the midst of an embarrassing blunder."

So why do we fear the faux pas?

"My best guess is that we've evolved a tendency to be overly concerned with how we appear to others," Savitsky says. "There are costs to that preoccupation, like anxiety, but the opposite error--not being sufficiently attentive to how others judge our transgressions--can be even more costly, possibly leading to social exclusion."

Yet there's hope. In a 2003 Journal of Experimental Social Psychology (Vol. 39, No. 6) study, Savitsky took students who were nervous about public speaking and told them the truth: Their nervousness is less obvious than they think. Observers indeed didn't notice their nervousness, and the students gave better speeches than a second group that received no such information.

"People are sometimes nervous over the possibility that they will look nervous--and the fear that their nervousness is apparent serves only to make them even more nervous," Savitsky says. "By telling them about our previous findings, we cut through that cycle. It's possible to get around these fears. I think that's cause for optimism. Yet I find I still overestimate how much others will notice my behavior, and I do this research. So some portion of it is inescapable."

That's the curse of intuition. Despite our better logic, it's easier to believe we can control the dice than to know it's left to fate.

"We chuckle at that dice rolling, because rationally we know it's not right," Gilovich says. "But it's a sympathetic chuckle, because we think, 'Hey, I'd do that too.'"

Further Reading

  • Dawes, R.M. (2001). Everyday irrationality: How pseudoscientists, lunatics, and the rest of us fail to think rationally. Boulder, CO: Westview Press.

  • Denes-Raj, V., & Epstein, S. (1994). Conflict between intuitive and rational processing: When people behave against their better judgment. Journal of Personality and Social Psychology, 66(5), 819-829.

  • Savitsky, K., Epley, N., & Gilovich, T. (2001). Do others judge us as harshly as we think? Overestimating the impact of our failures, shortcomings, and mishaps. Journal of Personality and Social Psychology, 81(1), 44-56.

  • Savitsky, K., & Gilovich, T. (2003). The illusion of transparency and the alleviation of speech anxiety. Journal of Experimental Social Psychology, 39(6), 618-625.

  • Spranca, M., Minsk, E., & Baron, J. (1991). Omission and commission in judgment and choice. Journal of Experimental Social Psychology, 27(1), 76-105.