Military Waste. Joshua O. Reno


to admit to the client that they need to alter the initial contract. This is the least desirable of all options and can therefore make working conditions and social relations difficult, even “hostile” at the plant. Several of the men I interviewed partly blamed this for a high turnover rate among young engineers.

      Finally, some product designs inevitably become obsolete, even before they are supplied to the military, simply because of the nature of contemporary war-readiness. This cannot be explained by decisions or difficulties on the part of either client or manufacturer. It represents, rather, the inescapability of entropic loss as a general principle in the material world. This in theory presents problems to any creator of any product, but is amplified in the context of a permanent war economy that demands continual improvements and updates. Simon first explained this to me:

      Another aspect. . .is that this helicopter is flying with six million lines of code or so, because each box is a computer, the radar is a computer, the torpedo is a computer, the bombs are computers, the sonobuoys are computers, and they all have to work together. And a lot of that equipment is obsolete. When the aircraft first gets accepted by the Navy, the computers on board are usually ten years obsolete.

      According to Simon, it is code that goes obsolete the quickest. This would appear to be akin to unplanned obsolescence, assuming producers did not intend for this to happen and that changes in computing are at least somewhat unpredictable. Ironically, creating a product that is already useless waste can occur precisely by attempting to avoid this possibility in pursuit of quality:

      That is because of all this testing, and manufacture. It starts obsolete. When you plan for the aircraft, ship or whatever, you plan for it starting obsolete, and that’s why they have a phase one, a phase two, a phase three of upgrading it. And that’s where you make the money, is in the upgrades. There are classified war games, computer simulations, that you go through. And you know the computer systems of your opponents.

      Simon depicted military equipment that was obsolete as soon as it was sold, a perfect illustration of the dynamics of endless innovation that Baran and Sweezy warned of more than half a century ago. But firms were aware of this: it meant constantly tending to products after they were sold, in a creative process that does not end until they are decommissioned, destroyed, or both. For this reason, good management, for Eddy (an engineer who had retired eleven years earlier), meant recognizing that military production is “managing change,” constantly adjusting as products evolve, and as costs and schedules change along with them. Whether the demand to change emerges through interactions with people, process, or product, managing it means trying to avoid waste in its various guises and to preserve value, relationships, and community more broadly.

      Faced with change and uncertainty in their work process, the engineers and accountants I spoke with took pride in their own effort and expertise, first and foremost. That a project might fail to materialize, might be obsolete upon release, or might cost more than initially projected was, by comparison, out of their control.

      ETHICAL DIMENSIONS: RISK AND BLAME

      Experiences of responsibility or blameworthiness arise through social interaction, because ordinarily people only think to provide explicit justifications for what they do when impelled to do so by others (Keane 2016, 78–79). And distinct social situations can direct and deflect blame as a result. Inside the microworld of arms manufacturing and military contract procurement, for instance, one is more likely to be called upon to explain one’s work performance in certain ways and not others. As mentioned, in public discussions of the permanent war economy, blame is normally assigned on the basis of perceived self-interest—politicians want reelection, military personnel want more power, and corporations want more money. But in military production, blame is determined internally through product testing and audit.17 Put differently, testing products is a reflexive practice with greater social implications, beyond customer ties alone. That is because, in addition to demonstrating technical reliability that workers may take pleasure in, testing is an ethical activity that offers preemptive defense against blameworthiness, an alibi that allows engineers to manufacture products with a belief that they did the best they could.

      Generally speaking, corporations in a neoliberal era use auditing practices to reduce various risks to the durability and reliability of their products by increasing the predictability of their processes, including employee performance (see Shore and Wright 2000, 85). What this means for Lockheed employees is that at various stages in the design of a product they must reflect on product success in specific ways even before they pitch a project to the military. The more important the product, the more that is invested in predicting and preventing (or rather delaying) the product’s failure and therefore mitigating any person’s sense of accountability. More scrutiny with safety-critical products means more testing of software itself, but not necessarily more environmental testing. Simon explained this to me in detail:

      There’s a formal thing called a risk analysis, and you do a twenty-year risk analysis. This is a brand-new aircraft, brand new radios, brand new mechanical, computer brain that’s gonna be self-aware. What are going to be the technical challenges? All these things either won’t be built, can’t be built. . . what’s the plan? So you do a formal risk analysis and say, “At this point in time if we don’t have this design mature enough, we have to go to plan B or plan C.” And this formal risk analysis is approved.

      Here, formal risk analyses translate “waste” into a series of tests and procedures that arguably deflect responsibility by distributing the labor involved in creating instruments of violence. Testing and auditing also mean inviting other people into the product assessment process:

      Another thing Lockheed Martin does (and I assume other companies do) is, you present your design to an internal panel of what’s called “Wise Old Owls.” These are old scientists and engineers that have been through all of it before, and you first do a red team and a black team and they can shut down the Lockheed Martin effort and fire all us engineers if they don’t think we’re gonna win.

      These internal “gatekeepers” monitor and assess projects and can have them thrown out for technical reasons, but Simon did not seem concerned about the human or financial cost, likening it to a form of blind peer review (perhaps because I was his interlocutor, and he was searching for an analogous domain of socially mediated audit).

      If internal competition and risk analysis can lay waste to projects, in anticipation of their presentation to the client, this is actually meant to save time and money in the long run. If a project is committed to for twenty years, then even more capital, human and financial, is at stake. But too much process analysis can also be wasteful, from the perspective of Lockheed workers. The new face of manufacturing is not about being productive, Bork told me, but predictive. He said a lot of time and resources are now invested in cultivating reflexive attention to processes. What does this look like in practice? As Bork put it:

      So that you have a set of rules that will always give you an expected result coming out. “I have this much time to develop this, and based on my processes, I know I need this many people, this amount of time and this amount of money, and in the end I’ll have what I’m asking for.” So it gives you predictability, that’s the big thing.

      If a process was supposed to take you an hour but actually takes much longer, you want not only to shorten the amount of time it takes, but also to understand why you had it wrong the first time. This means asking not only what happened, but also how one can improve in the future. This is not bad, necessarily, but since new approaches are rarely implemented uniformly or perfectly, it ends up taking even longer to do one’s work. It is easy to get cynical about predictive management for this reason:

      You’ve got the new darling method of development that’s making the rounds right now: Agile. This is the latest round of Kool-Aid. Every six or seven years, there’s some new paradigm of how to develop software that makes the rounds, and it’s
