Communicating in Risk, Crisis, and High Stress Situations: Evidence-Based Strategies and Practice. Vincent T. Covello
Figure 3.1 Strategies for overcoming false information
Source: International Federation of Library Associations and Institutions (2021). “How to Spot Fake News.” Accessed at: https://www.ifla.org/publications/node/11174.
3.2.6 Characteristics and Limitations of People in their Ability to Evaluate and Interpret Risk Information
As shown in Table 3.5 and described below, at least 13 factors interfere with people's ability to evaluate and interpret risk information. Individually and collectively, these factors often lead to uninformed decisions. One of the challenges faced by risk managers and communicators is to identify and address the specific interference factor or factors affecting informed decision‐making.
1 Inaccurate perceptions of risk: People often overestimate some risks and underestimate others. For example, people often overestimate the risks of dramatic or sensational causes of death, such as accidents at manufacturing plants, but underestimate the risks of less dramatic causes of death, such as asthma, emphysema, and diabetes. Adverse consequences of risk overestimation include dysfunctional behaviors, stress, anxiety, dread, confusion, hopelessness, helplessness, and misallocation of risk‐reduction resources. Adverse consequences of risk underestimation include apathy and denial, which can lead to the failure to take appropriate actions. Overestimation or underestimation of risks is caused in part by the tendency for risk judgments to be influenced by the memorability of past events and by the ability to imagine future events. A recent disaster, intense media coverage, or a vivid film can heighten the perception of risk. Conversely, risks that are not memorable, obvious, palpable, tangible, or immediate are underestimated.
2 Difficulties understanding statistical or complex scientific information related to unfamiliar activities or technologies: A variety of cognitive biases and related factors hamper people’s understanding of probabilities. This difficulty hampers discussions about risks between experts and nonexperts. For example, risk experts are often confused by the public’s rejection of the argument that a risk from a new activity or technology is acceptable if the risk is smaller than the ones people face in their daily lives.
3 Personalization: People often personalize the risk. Sample question: What if I am the person who is harmed or adversely impacted?
4 Trustworthiness: People often raise questions of trust. Sample question: Why should I believe you on this issue given you previously made mistakes or changed your mind about risks and threats?
5 Cumulative Risks: People often raise concerns about cumulative risks. Sample question: I already have enough risks in my life. Why should I take on even one more?
6 Benefits: People often question whether the risks are worth the benefits. Sample question: Will the benefits of the new activity or technology significantly outweigh the risks?
7 Ethics: People often raise ethical questions. Sample question: Who gave you the right to make decisions that violate the moral principles and rights of others? Complicating the perception of fairness is the difficulty people have understanding, appreciating, and interpreting small probabilities, such as the difference between 1 chance in 100,000 and 1 chance in 1,000,000. These same problems hamper discussions between technical experts and nonexperts about what is remotely possible and what is probable. Given these difficulties, to be effective, risk communication strategies must address the experiences, attitudes, beliefs, values, and culture of those receiving information about a risk or threat. Effective risk communication skills are built on a foundation of understanding how people perceive risks.
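One common way to make a difference such as 1 in 100,000 versus 1 in 1,000,000 more tangible is to restate each probability as the expected number of affected people in a reference population. The sketch below illustrates this restatement; the population figure is hypothetical and chosen only for illustration, not drawn from the book.

```python
# Illustrative sketch (not from the book): restating two small
# probabilities as expected cases in a hypothetical population,
# which makes a tenfold difference easier to grasp.

def expected_cases(annual_risk: float, population: int) -> float:
    """Expected number of affected people per year."""
    return annual_risk * population

population = 10_000_000  # hypothetical city-sized population

risk_a = 1 / 100_000     # 1 chance in 100,000
risk_b = 1 / 1_000_000   # 1 chance in 1,000,000

print(round(expected_cases(risk_a, population)))  # 100 expected cases
print(round(expected_cases(risk_b, population)))  # 10 expected cases
print(round(risk_a / risk_b))                     # 10, i.e. a tenfold difference
```

Expressed this way, the abstract ratio becomes a concrete contrast: 100 affected people versus 10 in the same population.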
8 Strong emotional responses to risk information: Strong feelings of fear, worry, anger, outrage, and helplessness are often evoked by exposure to unwanted or dreaded risks. These emotions often make it difficult for leaders, risk managers, and technical experts to engage in constructive discussions about risks in public settings. Emotions are most intense when people perceive the risk to be involuntary, unfair, not under their personal control, managed by untrustworthy individuals or organizations, and offering few benefits. More extreme emotional reactions often occur when the risk affects children, when the adverse consequences are particularly dreaded, and when worst‐case scenarios are imagined. Strong emotional responses to risk information are not necessarily wrong or contrary to knowledge. They can be based on practical or experiential knowledge and emphasize what people value, such as fairness and equity. Strong emotional responses are also not necessarily opposed to reason. The notion that rationality and emotion are opposed derives from the belief that the human brain perceives reality in two distinct ways: one emotional, instinctive, intuitive, and spontaneous; the other rational, analytical, and statistical, having emerged later in human evolution. In practice, however, strong emotional responses to risk information can be rational, as when they deter people from engaging in dangerous activities.
9 Desires and demands for scientific certainty: People often display a marked aversion to uncertainty. They use a variety of coping mechanisms to reduce the anxiety generated by uncertainty. This aversion frequently translates into a clear preference for statements of fact over statements of probability, which is the language of risk assessment. People often demand that technical experts tell them exactly what will happen, not what might happen. For example, the changing recommendations during the COVID‐19 pandemic on matters such as whether face masks were effective or whether a person without symptoms could spread the disease frustrated many people and eroded their trust in science.
10 Strong beliefs that resist change: People tend to seek out information that confirms and supports their beliefs and often ignore evidence that contradicts their beliefs. Beliefs often operate on a polarized scale of True or False, with little gray in‐between. Opinions often operate on a different scale – Favorable or Unfavorable. According to the Four Hit Theory of Belief Formation, once formed, a belief is difficult or impossible to change. On average, four unanswered risk communication messages (hits) from trustworthy sources can crystallize into a belief. Fewer than four such messages typically produce only an opinion. A hit from one side can be negated by a hit from the other side. Strong beliefs about risks or threats, once formed, change slowly and are extraordinarily persistent even in the face of contrary evidence. Initial beliefs about risks structure the way subsequent evidence is interpreted. Fresh evidence – e.g., data provided by a technical expert – appears reliable and informative only if it is consistent with the initial belief; contrary evidence is dismissed as unreliable, erroneous, irrelevant, or unrepresentative.
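The hit-counting mechanism described above can be expressed as a toy model. This is an illustrative simplification, not a formal specification from the book: each unanswered message from a trusted source adds a hit, each opposing message negates one, and an opinion crystallizes into a belief at four net hits.

```python
# Toy model (illustrative only) of the Four Hit Theory described above:
# unanswered messages from trusted sources accumulate as "hits",
# a message from the opposing side negates one hit, and at four
# net hits an opinion crystallizes into a belief.

BELIEF_THRESHOLD = 4  # net hits needed for an opinion to become a belief

def classify(messages: list) -> str:
    """Classify the outcome of a message sequence.

    'support' = a message reinforcing the position;
    'counter' = an opposing message that negates one hit.
    """
    hits = 0
    for msg in messages:
        if msg == "support":
            hits += 1
        elif msg == "counter":
            hits = max(0, hits - 1)
    return "belief" if hits >= BELIEF_THRESHOLD else "opinion"

print(classify(["support"] * 4))                    # belief
print(classify(["support", "counter", "support"]))  # opinion
```

The model captures the key asymmetry in the text: once countering messages keep the net hit count below four, the position remains a changeable opinion rather than a persistent belief.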
11 Opinions can be manipulated by how information is presented: When people lack strong prior beliefs, subtle changes in the way risk information is presented and framed can have a major impact on opinions. For example, two groups of physicians were asked to choose between two therapies – surgery or radiotherapy.5 Each group received the same information; however, the probabilities were expressed either in terms of dying or in terms of surviving. Both figures expressed the same probability, but the different presentations produced dramatic variation in the choice of therapy: physicians responded more favorably to the survival framing. The effects of information framing are, however, modified by factors such as risk aversion, experience, beliefs, level of risk, type of risk, and costs of risk mitigation.
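The equivalence at the heart of this framing effect can be sketched as follows. The mortality figure used here is hypothetical, chosen for illustration rather than taken from the cited study: the point is only that the two statements encode the identical probability.

```python
# Illustrative sketch (hypothetical numbers, not the study's data):
# the same underlying probability rendered in two equivalent frames.

def survival_frame(p_death: float) -> str:
    """State the outcome in terms of surviving."""
    return f"{(1 - p_death) * 100:.0f}% of patients survive"

def mortality_frame(p_death: float) -> str:
    """State the identical outcome in terms of dying."""
    return f"{p_death * 100:.0f}% of patients die"

p_death = 0.10  # hypothetical 10% mortality risk

# Both statements carry the same information, yet audiences
# typically respond more favorably to the survival framing.
print(survival_frame(p_death))   # 90% of patients survive
print(mortality_frame(p_death))  # 10% of patients die
```

Because the two frames are logically interchangeable, any systematic difference in the choices they elicit is attributable to presentation alone.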
12 Ignoring or dismissing risk information because of its perceived lack of personal relevance: Risk data often relates to society.