Figure It Out. Stephen P. Anderson
“It’s like shaving with the back of a spoon.”
From the poetry of advertising to the well-crafted political speech, a careful choice of words can win hearts and change minds. It can be an explicit association—like shaving with a spoon—or something more suggestive.
Consider what is now a near-ubiquitous phrase in software development: technical debt (or tech debt for short). This metaphor was coined by Ward Cunningham to explain the refactoring work that programmers so often must contend with. It’s one thing to argue that cutting corners—to deliver results more quickly in the near term—will cost more in the long run. It’s quite another thing to link that rational argument with something we all know and feel emotionally: the dangers of going into financial debt. Whether we’re in debt or not, we all understand that debt is a dangerous thing. You shouldn’t knowingly take on debt unless you have a plan to get out of it and are fully aware of what it will cost you. This phrase, technical debt, is so powerful that we’ve seen it picked up by other product groups to draw attention to analogous situations: design debt, product debt, conceptual debt.
A very subtle association can certainly change how people think and feel about something. Spoons. Debt. But can these associations change behavior?
Metaphors and Crime: Is Crime a Virus or a Beast?
Researchers Paul H. Thibodeau and Lera Boroditsky have investigated the role of metaphors in reasoning.1 In a study from 2011 (repeated in 2015), students were asked to read one of two reports about crime in a city and then suggest solutions for the problem. These reports were nearly identical, citing the same statistical data, except for one small detail: the metaphor used to describe the crime. In the first report, crime was characterized as a “wild beast preying on the city” and “lurking in neighborhoods.” In the second report, crime was described as a “virus infecting the city” and “plaguing” neighborhoods. The researchers wanted to investigate whether changing the metaphorical framing would influence whether students chose solutions that were more enforcement-oriented (prison, street patrols) or social-reform-oriented (education, economic reform). Would this beast or virus framing make a difference in the selection of policy responses? While sifting through the data from these studies was complicated, and highlighted a broader issue with shifting cultural norms, the research did conclude that framing crime as a beast made people more likely to prefer enforcement-oriented solutions than if crime had been framed as a virus. The research also stated that “people who read that crime was a virus were more likely to endorse the proposal to ‘develop neighborhood watch programs and do more community outreach,’ than people who read that crime was a beast.”
In this case, the metaphors used to describe crime did seem to influence people’s reasoning about crime and the selection of correlative solutions.
Far from being mere rhetorical flourishes, metaphors have profound influences on how we conceptualize and act with respect to important societal issues. We find that exposure to even a single metaphor can induce substantial differences in opinion about how to solve social problems.
—PAUL H. THIBODEAU AND LERA BORODITSKY
Decision Framing and Cognitive Bias
While research into metaphors and linguistic framing is ongoing (at least one study has offered up alternative explanations other than metaphorical framing2), similar studies going back to the 1970s also seem to support this general correlation between things as stated and our reasoning. In studies on “decision framing,” behavioral economists have found that the identical question, asked in different ways, leads to completely different responses.
For example, would you prefer a condom that was “95% effective” or one that had a “5% failure” rate? What about “80% lean” ground beef versus “20% fat” ground beef?3
Rationally, these are identical options. But in repeated studies, most people chose the first option: the one that avoided any mention of loss and framed things in the positive. The conclusion of behavioral economists such as Amos Tversky and Daniel Kahneman was that the choices we make are influenced by the way the options are framed.
In what is now an oft-cited study from behavioral economics,4 participants were asked to choose between two alternative programs to combat the outbreak of a disease expected to kill 600 people. The first group of respondents was presented with this outbreak situation, with two options to choose from, both framed in terms of lives saved; the second group of respondents was presented with the identical situation and the same two options, but these options were framed in terms of lives lost. What Tversky and Kahneman found was that “choices involving gains were often risk averse and choices involving losses were often risk taking.” While respondents from both groups were asked essentially the same question, answers were nearly reversed between groups. To the extent that framing something triggered a human bias (risk aversion or risk taking in this study), we can conclude that framing—generally—has a powerful effect on the choices we make.
One of the more humorous studies into decision framing came from researchers at the University of Nottingham Centre for Decision Research and Experimental Economics, who posed an interesting question: Would other experimental economists—those professionals who researched these types of topics—be prone to the same framing effects? Using a “natural field experiment” (participants were not aware they were part of a study), researchers used registrations for the 2006 Economic Science Association conference as part of their experiment. As part of the registration process, attendees—the unwitting research participants—were sent an email reminder concerning a fee for late registration. In what is known in web analytics as an A/B test, participants were sent one of two possible emails, each using a different frame. One email warned attendees of a penalty of $50 for registering after the first deadline. The other email communicated a discount of $50 for registering before the first deadline. What were the results? Curiously, 93% of the junior researchers responded to the late fee messaging compared to 67% who responded to the discount messaging.5 These results—that things framed in the negative received more engagement—were consistent with other studies.
Of course, many of these studies invite further investigation. With the conference signup, the study also showed there was no effect for senior researchers. And with the “disease outbreak” study, critics have pointed out that the phrasing might suggest different—rather than identical—outcomes. Those disclaimers aside, the effects of framing are well researched and generally agreed upon. In our view, none of this undercuts the broader point we’re trying to make: behind all of these studies sits a common thread—how we think and reason about things is based on associations between concepts. Framing, like metaphor, is an example of this cognitive process in action.
The Economic Engine, Sick Patient, or ...?
Understanding the power of language to shape how we think, it’s no surprise that an economist and author like Paul Krugman might lash out against the metaphors used to characterize the economy. In “Block That Metaphor,” a 2010 Op-Ed piece for The New York Times, Krugman called out phrases like “jump-start the economy” and giving a “fragile recovery” time to strengthen. From the column:
America’s economy isn’t a stalled car, nor is it an invalid who will soon return to health if he gets a bit more rest. Our problems are longer-term than either metaphor implies.
And bad metaphors make for bad policy. The idea that the economic engine is