Code Nation. Michael J. Halvorson
as a framing term, the methodology would have an important impact on programming culture in the coming decades.9

      Let’s take a look at the problems software developers wrestled with in the 1960s. At that time, the dominant paradigm for software development revolved around a solitary (male) computer genius enigmatically holding court with a small team of assistants. The consensus at Garmisch was that this scenario needed to be replaced with a cohort of systematically trained engineers, responsible to management, who practiced structural thinking and followed orders. (A visual model of this hierarchy and approach can be seen in Figure 2.2.) As Nathan Ensmenger summarized, “Software engineering promised to bring control and predictability to the traditionally undisciplined practices of software development.”10

      Historian Stuart Shapiro analyzed the legacy of this “engineering movement” as it gained momentum in the 1970s and 1980s. According to Shapiro, the program was less about specific technical procedures and more about finding ways to regulate and standardize growing project complexity, budgets, and the new cadre of software engineers who had until recently attracted little notice. New approaches to programming took center stage in the push to make software development outcomes more reliable. These strategies included the use of structured programming techniques, adding language features to promote reliability, measuring software performance (metrics), and using integrated development environments (IDEs).11 Most of these ideas would make their way into personal computing, too, although it would take a decade or more for the engineering practices to take hold.
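
      The contrast that structured programming advocates drew can be suggested with a small example. The following Python sketch is illustrative only (the task and both functions are invented, not drawn from the book): it simulates goto-style jumps with an explicit state variable, then rewrites the same logic using only the sequence, selection, and iteration constructs that structured programming permits.

```python
# A minimal, hypothetical sketch contrasting goto-style control flow,
# simulated here with an explicit state variable, against its
# structured equivalent. The task: sum the positive numbers in a list.

def sum_positives_unstructured(values):
    """Goto-style flow simulated as jumps between labeled states."""
    state, i, total = "START", 0, 0
    while True:
        if state == "START":
            i, total = 0, 0
            state = "TEST"
        elif state == "TEST":
            state = "DONE" if i >= len(values) else "BODY"
        elif state == "BODY":
            if values[i] > 0:
                total += values[i]
            i += 1
            state = "TEST"   # jump back, like a goto
        elif state == "DONE":
            return total

def sum_positives_structured(values):
    """The same logic using only sequence, selection, and iteration."""
    total = 0
    for v in values:
        if v > 0:
            total += v
    return total

assert sum_positives_unstructured([3, -1, 4]) == 7
assert sum_positives_structured([3, -1, 4]) == 7
```

      The structured version has a single entry and a single exit, so its behavior can be reasoned about, and verified, one construct at a time, which is precisely the predictability the engineering reformers were after.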

      Computer programming is sometimes envisioned as an individual task, but by the late 1960s, commercial software was typically constructed in groups. For example, in mainframe computing environments like the IBM System/360, it was necessary to hire an army of analysts, technicians, and software developers to build and maintain non-trivial systems. Productivity gains associated with the “division of labor” principle simply did not work when dividing up the tasks of a large software project. The new approach had to involve teaming. Grace Hopper subtly introduced this concept when she created the world’s first computer language compiler in 1952, known as the A-0 system, which translated symbolic codes into machine language. Hopper completed the first draft of her work and then immediately shared it with associates to see if they could make improvements, a strategy that she also followed during the creation of FLOW-MATIC, the predecessor to COBOL.
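
      To suggest what “translating symbolic codes into machine language” involves, here is a deliberately toy sketch in Python. It is hypothetical and vastly simpler than Hopper’s actual A-0 system, which assembled programs from stored subroutines; the mnemonics and opcode numbers below are invented for illustration.

```python
# A hypothetical, highly simplified translator from symbolic codes to
# numeric "machine" codes. The mnemonics and opcode values are invented;
# the real A-0 system worked quite differently.

OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def translate(symbolic_program):
    """Turn lines like 'ADD 32' into (opcode, operand) number pairs."""
    machine_code = []
    for line in symbolic_program:
        mnemonic, *operand = line.split()
        machine_code.append((OPCODES[mnemonic],
                             int(operand[0]) if operand else 0))
    return machine_code

program = ["LOAD 10", "ADD 32", "STORE 64", "HALT"]
print(translate(program))
# [(1, 10), (2, 32), (3, 64), (255, 0)]
```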


      Figure 2.2 By the 1960s, many engineering groups were using structure and clearly defined roles to bring predictability to their projects. The team that created the DEC PDP-6 computer was led by C. Gordon Bell, the man wearing a suit jacket in this 1964 photo.

       Standing L–R: Peter Sampson—Operating System Programmer, Leo Gossell—Diagnostic Software Programmer, C. Gordon Bell—PDP-6 System Designer (Creator), Alan Kotok—Operating System Lead, Russell Doane—Circuit Design Engineer, Bill Kellicker—Programmer, Bob Reed—Hardware Technician, George Vogelsang—Draftsman.

       Sitting L–R: Lydia Lowe (McKalip)—Secretary to C. Gordon Bell, Bill Colburn—PDP-6 Project Engineer, Ken Senior—Field Service Technician, Ken Fitzgerald—Mechanical Engineer, Norman Hurst—Programmer, Harris Hyman—Operating Systems Programmer. (Courtesy the Computer History Museum and DEC)

      An underlying current of the 1968 NATO conference was that building software entailed a level of complexity that few fully recognized when digital electronic computers hit the scene in the 1950s.

      But what made computer software so complex?

      Let us start with a formal definition. Computer software is the part of a computer system that consists of data and computer instructions. Computer software is distinct from computer hardware, or the physical components of a computing system, such as the processors, circuits, boards, panels, monitors, switches, and other electronic devices in a machine.

      A basic understanding of what software consists of changes over time, so it is useful to visualize a list of items that has taken shape in evolving contexts. Modern software includes a wealth of program types (operating system components, device drivers, application software, games, programming tools, malware), as well as supporting libraries, data, images, sound recordings, videos, email messages, Facebook posts, Tweets, and all manner of digital media. A software release typically consists of a bundled collection of items with a particular purpose, including a setup program, executable files, and hundreds (or thousands) of installed components, digital media files, documentation, and other resources.

      From a business point of view, software may be considered a commercial product with economic value and utility, such as the popular applications GarageBand for iOS, Adobe Photoshop, or Microsoft Office. Software may also be distributed for free, such as open-source software or freeware. In these many contexts, one piece of software is distinct from another on a computer system, in the marketplace, or (under certain conditions) in copyright law. As recent historians have also discovered, each piece of software has its own history and impact, its creators and users. Software carries cultural memories and a society’s hopes and fears.

      In the 1960s, most software programs were supplied for free with the expensive computer systems that organizations purchased or rented from hardware manufacturers. In the U.S., IBM was the leading computer manufacturer by a large margin, followed by successful electronics firms like Burroughs, UNIVAC, NCR, Control Data, Honeywell, General Electric, and RCA. Corporations used mainframe computers for complex calculations, resource planning, bulk data processing, and transaction processing, including managing shipping, payroll, and employee records. In these many contexts, organizations needed to adapt the free software that they were given to match the needs of their customers. They needed to hire and train programmers and technicians to accomplish this work.

      As computers grew and took on more tasks, the ailments plaguing software could often be traced back to one principal cause—system complexity. The complexity of software was engendered by programming’s abstract nature and the scientific principle that a program constitutes a digital (discrete state) system based on mathematical logic rather than an analog system based on continuous functions.12 As software systems were being constructed with growing sophistication, project designers needed to consider numerous interrelated factors in their solutions, including the organization’s list of requirements for a system (clearly stated or not), operating constraints related to hardware and software platforms, technical conditions within the computer itself (including memory resources), and the wide range of possible inputs and outputs that a program might encounter as it completed its work.
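
      The distinction between discrete and continuous systems can be made concrete with a hypothetical Python sketch (the threshold and functions below are invented). A continuous function responds smoothly to nearby inputs, while a single branch in a program makes two adjacent inputs behave completely differently, which is one reason testing nearby inputs reveals so little about a program’s full behavior.

```python
# Illustration (not from the book): why discrete-state behavior resists
# the interpolation that works for continuous, analog systems. The
# threshold is arbitrary, chosen only to make the contrast visible.

def analog_response(x):
    """A continuous function: small input changes, small output changes."""
    return 2.0 * x + 1.0

def program_response(x):
    """A discrete-state rule: one branch flips the behavior entirely."""
    if x >= 100:          # a single comparison buried in the logic
        return -1         # error path
    return 2 * x + 1

# Testing x = 99 tells you almost nothing about x = 100.
print(analog_response(99.0), analog_response(100.0))   # 199.0 201.0
print(program_response(99), program_response(100))     # 199 -1
```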

      Real-world computer systems were designed so that they used only a prescribed set of resources, such as memory and processor time. From the point of view of the programmer, additional complexity arrived in the selection of programming languages, data structures, algorithms, flow control mechanisms, error handling structures, and the use of inherited source files and legacy code from other projects.
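
      A modern way to picture “a prescribed set of resources” is a process that caps its own memory and processor time. The sketch below uses Python’s standard-library resource module (Unix-only); the specific limits are arbitrary examples, not values from the book.

```python
# Sketch (Unix-only) of running code under a prescribed resource budget,
# using Python's standard-library resource module. The limits shown are
# arbitrary examples.

import resource

# Cap CPU time at 2 seconds (soft and hard limits).
resource.setrlimit(resource.RLIMIT_CPU, (2, 2))

# Cap the address space at 512 MiB.
limit = 512 * 1024 * 1024
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

# Work that exceeds these limits is cut off by the operating system,
# much as 1960s systems allotted fixed memory partitions and time slices.
print(sum(range(10_000_000)))
```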

      Individual computer programmers also brought their own tastes and psychological experiences to a
