Damned Lies and Statistics. Joel Best

be “softer” than our knowledge of the physical world. Physicists have far more confidence in their measurements of the atomic weight of mercury than sociologists have in their descriptions of public attitudes toward abortion. This is because there are well-established, generally agreed-upon procedures for measuring atomic weights and because such measurements consistently produce the same results. In contrast, there is less agreement among social scientists about how best to measure—or even how to define—public opinion.

      Although we sometimes treat social statistics as straightforward, hard facts, we ought to ask how those numbers are created. Remember that people promoting social problems want to persuade others, and they use statistics to make their claims more persuasive. Often, the ways people produce statistics are flawed: their numbers may be little more than guesses; or the figures may be a product of poor definitions, flawed measurements, or weak sampling. These are the four basic ways to create bad social statistics.

      GUESSING

      Activists hoping to draw attention to a new social problem often find that there are no good statistics available.* When a troublesome social condition has been ignored, there usually are no accurate records about the condition to serve as the basis for good statistics. Therefore, when reporters ask activists for facts and figures (“Exactly how big is this problem?”), the activists cannot produce official, authoritative numbers.

      What activists do have is their own sense that the problem is widespread and getting worse. After all, they believe it is an important problem, and they spend much of their time learning more about it and talking to other people who share their concerns. A hothouse atmosphere develops in which everyone agrees this is a big, important problem. People tell one another stories about the problem and, if no one has been keeping careful records, activists soon realize that many cases of the problem—maybe the vast majority—go unreported and leave no records.

      Criminologists use the expression “the dark figure” to refer to the proportion of crimes that don’t appear in crime statistics.1 In theory, citizens report crimes to the police, the police keep records of those reports, and those records become the basis for calculating crime rates. But some crimes are not reported (because people are too afraid or too busy to call the police, or because they doubt the police will be able to do anything useful), and the police may not keep records of all the reports they receive, so the crime rate inevitably underestimates the actual amount of crime. The difference between the number of officially recorded crimes and the true number of crimes is the dark figure.

      Every social problem has a dark figure because some instances (of crime, child abuse, poverty, or whatever) inevitably go unrecorded. How big is the dark figure? When we first learn about a problem that has never before received attention, when no one has any idea how common the problem actually is, we might think of the dark figure as being the entire problem. In other cases where recordkeeping is very thorough, the dark figure may be relatively small (for example, criminologists believe that the vast majority of homicides are recorded, simply because dead bodies usually come to police attention).

      So, when reporters or officials ask activists about the size of a newly created social problem, the activists usually have to guess about the problem’s dark figure. They offer estimates, educated guesses, guesstimates, ballpark figures, or stabs in the dark. When Nightline’s Ted Koppel asked Mitch Snyder, a leading activist for the homeless in the early 1980s, for the source of the estimate that there were two to three million homeless persons, Snyder explained: “Everybody demanded it. Everybody said we want a number…. We got on the phone, we made a lot of calls, we talked to a lot of people, and we said, ‘Okay, here are some numbers.’ They have no meaning, no value.”2 Because activists sincerely believe that the new problem is big and important, and because they suspect that there is a very large dark figure of unreported or unrecorded cases, the activists’ estimates tend to be high, to err on the side of exaggeration. Their guesses are far more likely to overestimate than underestimate a problem’s size. (Activists also favor round numbers. It is remarkable how often their estimates peg the frequency of some social problem at one [or two or more] million cases per year.3)

      Being little more than guesses—and probably guesses that are too high—usually will not discredit activists’ estimates. After all, the media ask activists for estimates precisely because they can’t find more accurate statistics. Reporters want to report facts, activists’ numbers look like facts, and it may be difficult, even impossible, to find other numbers, so the media tend to report the activists’ figures. (Scott Adams, the cartoonist who draws Dilbert, explains the process: “Reporters are faced with the daily choice of painstakingly researching stories or writing whatever people tell them. Both approaches pay the same.”4)

      Once a number appears in one news report, that report is a potential source for everyone who becomes interested in the social problem; officials, experts, activists, and other reporters routinely repeat figures that appear in press reports. The number takes on a life of its own, and it goes through “number laundering.”5 Its origins as someone’s best guess are now forgotten and, through repetition, it comes to be treated as a straightforward fact—accurate and authoritative. Soon the trail becomes muddy. People lose track of the estimate’s original source, but they assume the number must be correct because it appears everywhere—in news reports, politicians’ speeches, articles in scholarly journals and law reviews, and so on. Over time, as people repeat the number, they may begin to change its meaning, to embellish the statistic.

      Consider early estimates for the crime of stalking.6 Concern about stalking spread very rapidly in the early 1990s; the media publicized the problem, and most state legislatures passed anti-stalking laws. At that time, no official agencies were keeping track of stalking cases, and no studies of the extent of stalking had been done, so there was no way anyone could know how often stalking occurred. After a newsmagazine story reported “researchers suggest that up to 200,000 people exhibit a stalker’s traits,”7 other news reports picked up the “suggested” figure and confidently repeated that there were 200,000 people being stalked. Soon, the media began to improve the statistic. The host of a television talk show declared, “There are an estimated 200,000 stalkers in the United States, and those are only the ones that we have track of.”8 An article in Cosmopolitan warned: “Some two hundred thousand people in the U.S. pursue the famous. No one knows how many people stalk the rest of us, but the figure is probably higher.”9 Thus, the original guess became a foundation for other, even bigger guesses (chapter 3 explores how repeating statistics often alters their meaning).10

      People who create or repeat a statistic often feel they have a stake in defending the number. When someone disputes an estimate and offers a very different (often lower) figure, people may rush to defend the original estimate and attack the new number and anyone who dares to use it. For example, after activists estimated that there were three million homeless in the early 1980s and the Reagan administration countered that the actual number was closer to 300,000, the activists argued that the administration’s figures could not be trusted: after all, the administration was committed to reducing expenditures on social programs and could be expected to minimize the need for additional social services.11 Various social scientists set out to measure the size of the homeless population. When their findings confirmed that the 300,000 figure was more reasonable, the social scientists came under attack from activists who charged that the research had to be flawed, that the researchers’ sympathies must have been with the administration, not the homeless.12 In general, the press continued reporting the large estimates. After all, activists and reporters knew that the actual number of homeless persons was much higher—didn’t everyone agree that three million was the correct figure? This example suggests that any estimate can be defended by challenging the motives of anyone who disputes the figure.

      In addition, the dark figure often plays a prominent part in defending guesses.
