Princeton University

Intellectual Humility and the Bias Blind Spot

Emily Pronin



Although this error is most pronounced during infancy, even adults sometimes have difficulty separating their subjective perceptions from objective reality. This egocentric error affects people's basic perceptions (e.g., if a room feels too hot, one thinks it literally is too hot). Importantly, it also affects more complex judgments. For example, a person who says that the new healthcare bill is too liberal is likely to see that claim as objectively true rather than as an opinion shaped by self-interest, political ideology, media exposure, and the like. This poses a problem for intellectual humility: if people think their opinions, attitudes, and beliefs are direct responses to objective reality (rather than subjective assessments colored by personal biases), then they are likely to hold those opinions, attitudes, and beliefs with little humility. They are likely to voice those beliefs with excessive confidence and to show little interest in the views of those who disagree. In previous work, I have found that people show a "bias blind spot": a widespread inability to recognize when they themselves have been influenced by bias, even as they readily recognize bias all around them.

In this grant, I propose experiments to examine the implications of the bias blind spot for intellectual humility. First, I will examine how people respond to cues that they may be biased, and whether such cues reduce the bias blind spot and induce intellectual humility. I predict that people's persistent bias blindness will impede intellectual humility even in the face of signs that they may be biased. I will examine two such cues. One involves peer disagreement: how are people's perceptions of their own biases affected by the knowledge that similarly intelligent and informed people disagree with them? The other involves knowledge of bias in one's own decision-making processes.
That is, when people assess the presence of bias in their judgments, can they recognize that bias when they know that their judgments derive from a biased process (e.g., a process that selectively exposed them to one side of an issue)?

In the second portion of the grant, I will examine a psychological approach to fostering intellectual humility and mitigating the bias blind spot. That approach involves nudging people to think about the role of circumstances (or "situations") rather than personality factors (or "dispositions") in shaping their own and others' beliefs and actions. One of the most influential lessons of social psychology is that people, and especially Westerners, give too little weight to situational factors when trying to understand and explain themselves and others. I predict that shifting people's perspectives to make them more situationally focused (e.g., by having them write an essay on the circumstances that got them into college) will make them less convinced of their own objectivity, more interested in other perspectives, and more open to the possibility that the real "truth" lies between their own views and those of the people who disagree with them.

Collectively, I hope these experiments will illuminate our understanding of intellectual humility and also point toward a path for fostering it.