Social Psychology. Daniel W. Barrett


Motivated reasoning influences many domains of social behavior, including how people evaluate new technologies (Druckman & Bolsen, 2011), count ballots in disputed elections (Kopko, Bryner, Budziak, Devine, & Nawara, 2011), understand important political issues (Slothuus & de Vreese, 2010), interpret consumer brand information (Jain & Maheswaran, 2000), and apply stereotypes (Kunda & Sinclair, 1999). This section introduces two biases that, as you will see, can undermine relatively unbiased thinking and instead lead the perceiver to a preferred conclusion.

      Belief Perseverance: Believing When There Is No Evidence

      A belief is a conviction we hold about whether something is true or false, often formed on the basis of evidence or information that we accept as true. Sometimes we hold a belief that, when originally formed, was based on reasonably good evidence. Take, for instance, the belief in the existence of weapons of mass destruction (WMDs) in Iraq before the 2003 multinational invasion. U.S. political leaders—including President Bush and Secretary of State Colin Powell—argued that Iraq was an imminent and significant threat to Americans because it had substantial stockpiles of WMDs that could be used against the United States or its allies. At the time, ordinary Americans, U.S. Senators and Representatives, and the leaders of the United Nations and many allied nations accepted the data supplied by the U.S. government as valid evidence for this claim. However, after the invasion and intensive efforts to locate material traces of WMDs, it became clear that there were in fact none to be found. Nevertheless, many people—even those who acknowledged that the initial evidence that led them to believe Iraq had WMDs was false—continued to believe that Iraq had them (Lewandowsky, Stritzke, Oberauer, & Morales, 2005).

      In the Iraq case, beliefs about WMDs that were formed on the basis of specific evidence persisted even after all of that evidence was demonstrated to be false. This phenomenon of holding onto a belief that has been undermined by the facts is called belief perseverance (Bui, 2014; Ross, Lepper, & Hubbard, 1975). Belief perseverance occurs when the evidence for a particular belief has been completely discredited, yet the belief endures.

      In one laboratory study, Craig Anderson (1983) provided some participants with data demonstrating that risk takers made more successful firefighters than risk avoiders, whereas participants in another condition were supplied with data stating the opposite (that risk takers were less successful firefighters than risk avoiders). After all participants wrote a short paragraph justifying their beliefs, Anderson told them that the data they had been given were completely falsified and that there was no evidence that risk taking and firefighting success were in any way correlated. After the evidence for their beliefs had been undermined, participants were asked to state their true beliefs about this relationship, as if they had never been exposed to the evidence in the experiment. Despite this clear and total undermining of the supposed evidence, participants typically continued to believe that a relationship existed between risk taking and firefighting success. In other words, their beliefs persevered even though there was no longer any evidence to support them. These participants seemed to think that their beliefs about the relationship predated the experiment, although this was highly unlikely. Control group participants who were not exposed to any evidence about the relationship tended to have no strong belief about the link between risk taking and firefighting success (see Figure 3.4).

      Why do false beliefs persist? First, we feel pressure to stick with our commitments, including commitments to beliefs, and therefore find it surprisingly difficult to give them up (Cialdini, 2008; Ross et al., 1975). Second, the explanations that participants created to justify their belief—perhaps based on a story, learned in another context, of a firefighter who took a risk and saved a life—continued to support their initial beliefs and remained available even after the original evidence was discredited (Anderson, 1983; Anderson, Lepper, & Ross, 1980; Davies, 1997; Nestler, 2010). It did not seem to matter that these self-generated explanations were not based on evidence.

      Can you think of a way to overcome this problem? Take a minute . . . if thinking about an event makes it seem more likely to be true, then is there something else one can think about that might counter it? What about carefully considering how the exact opposite of what you initially believed could instead be true?

      Figure 3.4 Debiasing in the Perseverance of Social Theories

      Source: Adapted from Figure 1, Anderson, C. A. (1982). Inoculation and counterexplanation: Debiasing techniques in the perseverance of social theories. Social Cognition, 1(2), 126–139.

      Motivated Reasoning: Mental processing that is influenced by a person’s desires, feelings, or goals

      Belief Perseverance: Phenomenon of holding onto a belief when its validity has been undermined by the facts

      Considering the Opposite

      If completely undermining the evidence used to create and support a belief failed to shake many participants free of their patently false beliefs, then what more can be done to convince them to reject those beliefs? One strategy was examined by Anderson (1982). As described above, Anderson provided participants with evidence about the relationship between taking or avoiding risks and success as a firefighter. Participants exposed to evidence that risk takers were more successful wrote explanations arguing both for and against this claim, and those exposed to the opposite evidence did the same. All were thus forced to consider how the opposite of what they had read could be true. Next, they were given a chance to report what their true beliefs about the relationship had been before the experiment began. Unlike participants who were simply told that the evidence for their belief was fabricated, those who engaged in considering the opposite did in fact overcome their unsupported beliefs. Imagining how their beliefs could be false largely wiped out the belief perseverance effect described above (Anderson, 1982; Nestler, 2010).

      In sum, sometimes our beliefs are mistaken: They simply are not true. As we’ve seen, there are several reasons that we nevertheless hold onto them. One reason is that we may cling to a belief even after its evidence has been undermined, because we have created new reasons for maintaining it (Anderson et al., 1980). A second reason is that we may not have seriously considered alternative beliefs (Anderson, 1982).

      Considering the Opposite: Imagining how one’s beliefs could be false

      Think Again!

      1 What is a belief?

      2 What is belief perseverance, and how can “considering the opposite” overcome it?

      Confirmation Bias

      Imagine you are about to interview another student to determine if she is extraverted. You are given a set of questions to select from, some of which focus on behaviors and experiences indicative of extraversion (such as “How would you liven up a party?”) and others indicative of introversion (such as “In what situations would you like to be more outgoing?”). Would you choose questions whose answers are more likely to produce evidence consistent with the conclusion you are seeking, or inconsistent with it? Well, if you are like the students in Snyder and Swann’s (1978b) study, you’ll favor the consistent items that focus on extraversion. If, on the other hand, you are trying to learn whether she is introverted, you’ll prefer the introversion items. Why?

      In Chapter 1, we described how people are natural hypothesis testers; we develop ideas about how the world works and test our ideas in relevant situations.
