Lead Wars. Gerald Markowitz

California/Milbank Books on Health and the Public

diseases yielded great victories from the 1890s through the 1930s. But in the first decades of the twentieth century, a different view of the profession began to gain ascendancy, redefining the mission of public health in ways that belied its role as an agent of social reform. In 1916 Hibbert Hill, a leading advocate of this new direction, put it this way: “The old public health was concerned with the environment; the new is concerned with the individual. The old sought the sources of infectious disease in the surroundings of man; the new finds them in man himself. The old public health . . . failed because it sought [the sources] . . . in every place and in every thing where they were not.”4 In this view, the idea was for the fast-growing science of biological medicine to concentrate on treating disease person by person rather than on eradicating conditions that facilitated disease and its spread, in some cases encouraging reforms in behavior to reduce individual exposure to harm. Hence, like numerous other fields in the early decades of the century, public health became professionalized, imbuing itself with the aura of science and setting itself off as possessing special expertise.

      By the middle decades of the twentieth century, public health officials thus typically conceived of their field mainly as a laboratory-based scientific enterprise, and many public health professionals saw their work as a technocratic and scientific effort to control the agents that imperiled the public’s health individual by individual.5 We can see this shift in perspective in treating tuberculosis, for example. An infectious disease that terrified the American public in the eighteenth and nineteenth centuries, tuberculosis had begun to decline as a serious threat by the early twentieth century, mainly because of housing reforms, improvements in nutritional standards, and general environmental sanitation. By midcentury, public health officials tended to downplay such environmental conditions and came to rely instead on the armamentarium of new antibiotic therapies to address the relatively small number of tuberculosis victims. The history of responding to industrial accidents and disease offers another example. In the early years of the twentieth century, reformers such as Crystal Eastman addressed the plague of industrial accidents and disease in the steel and coal towns of Pennsylvania by advocating for higher wages, shorter hours, and better working conditions through unionization. By the 1950s, industrial disease and accidents had largely faded from public health view—ironically, in part because the earlier reform efforts had led to protective legislation—and it was left to company physicians to treat individual workers. This turn toward technological and individualistic solutions to problems that had once been defined as societal was by midcentury part of a general shift in American culture away from divisive class politics and toward a faith in ostensibly class-neutral science, technology, and industrial prowess as the best way to address social or public-health-related problems.

      Since the early twentieth century, a tension has existed within the public health field—which mirrors a societal one—between, on the one hand, those who set their sights on prevention of disease and conditions dangerous to health through society-wide efforts and, on the other, those who believe in the more modest and pragmatic goal of ameliorating conditions through piecemeal reforms, personal education, and individual treatments. Despite the tremendous successes of environmental and political efforts to stem epidemics and lower mortality from infectious diseases, the credit for these improvements went to physicians (and the potent drugs they sometimes had at hand), whose role was to treat individuals. This shift also coincidentally, or not so coincidentally, undermined a public health logic that was potentially disruptive to existing social and power relationships between landlord and tenant, worker and industrialist, and poor immigrants and political leaders.

      At elite universities around the country—from Harvard, Yale, and Columbia to Johns Hopkins and Tulane—new schools of public health were established in the first two decades of the twentieth century with funds from the Rockefeller and Carnegie Foundations. Educators at these new schools had faith that science and technology could ameliorate the public health threats that fed broader social conflicts. They envisioned a politically neutral technological and scientific field removed from the politics of reform. The Johns Hopkins School of Hygiene and Public Health was at the center of this movement. William Welch, the school’s founder and first director (as well as the first dean of the university’s medical school), argued persuasively that bacteriology and the laboratory sciences held the key to the future of the field.6 By the mid-twentieth century, municipal public health officials in most cities had adopted this approach. If early in the century public health workers in alliance with social reformers succeeded in getting legislation passed to control child labor and the dangers to health that accompanied it, and to protect women from having to work with such dangerous chemicals as phosphorus and lead, by midcentury departments of health worked more often to reduce exposures of workers to “acceptable” levels that would limit damage rather than eliminate it. Similarly, by the 1970s departments of health had established clinics aimed at treating the person with tuberculosis but displayed little interest in joining with reformers to tear down slums and build better houses for at-risk low-income people.7

      By the 1950s and 1960s, when childhood lead poisoning emerged as a major national issue, public health practitioners were divided between those who defined their roles as identifying victims and treating symptoms and those who in addition sought alliances with social activists to prevent poisoning through housing reforms that would require lead removal. Drawing on the social movements of the 1960s, health professionals joined with antipoverty groups, civil rights organizations, environmentalists, and antiwar activists to struggle for access to health facilities for African Americans in the South and in underserved urban areas, for Chicanos active in the United Farm Workers’ strikes in the grape-growing areas of California and the West, for Native Americans on reservations throughout the country, and for soldiers returning from Vietnam suffering from post-traumatic stress disorders, among others. By the end of the twentieth century, though, the effort to eliminate childhood lead poisoning through improving urban infrastructure had largely been abandoned in favor of reducing exposures.

      CHILDHOOD LEAD POISONING: PUBLIC HEALTH TRIUMPH OR TRAGEDY?

      The campaign to halt childhood lead poisoning is often told as one of the great public health victories, like the efforts to eliminate diphtheria, polio, and other childhood scourges. After all, with the removal of lead from gasoline, blood lead levels of American children between the ages of one and five years declined precipitously from 15 micrograms per deciliter (μg/dl) in 1976–80 to 2.7 μg/dl by 1991–94,8 and levels have continued to drop. Today, the median blood lead level among children aged one–five years is 1.4 μg/dl, and 95 percent of children in this age group have levels below 4.1 μg/dl. Viewed from a broader perspective, however, the story is more complicated, and disturbing, and may constitute what Bruce Lanphear, a leading lead researcher, calls “a pyrrhic victory.”9 If 95 percent of American children have blood lead levels below what is today considered the danger level, then 5 percent—a half million children—still have dangerous amounts of lead in their bodies. A century of knowledge about the harmful effects of lead in the environment and the success of efforts to eliminate some of its sources have not stanched the flood of this toxic material that is polluting our children, our local environments, and our planet.

      FIGURE 1. Rates of lead poisoning, 2003. These rates are based on the CDC’s 2003 level of concern (10 μg/dl). In 2012, the CDC lowered that to 5 μg/dl, increasing the number of at-risk children from approximately 250,000 to nearly half a million. Source: Environmental Health Watch, available at www.gcbl.org/system/files/images/lead_rates_national.jpg.

      Today, despite broad understanding of the toxicity of this material, the world mines more lead and uses it in a wider variety of products than ever before. Our handheld electronic devices, the sheathing in our computers, and the batteries in our motor vehicles, even in new “green” cars such as the Prius, depend on it. While in the United States the new uses of lead are to a certain degree circumscribed, the disposal of all our electronic