It is important to note that the software and hardware of peer production embody an institutional dimension sui generis (Lessig, 1999). As virtually all projects rest on a technological infrastructure, design decisions are essential instruments that configure the agency of contributors. The programs and algorithmic procedures laid down in code materialize regulations and social norms by organizing access to technical features as well as to the programming facilities themselves (Kesan & Shah, 2005).
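To make this point concrete, the following is a minimal, hypothetical Python sketch of how a written rule can be hard‐wired into platform code so that enforcement happens mechanically on every action; the class, function, and threshold names are illustrative assumptions, not drawn from any actual wiki engine.

```python
# Hypothetical sketch: the rule "only established users may edit
# protected pages" materialized in code rather than in a written
# policy document. All names and thresholds are illustrative.

from dataclasses import dataclass


@dataclass
class User:
    name: str
    edit_count: int
    account_age_days: int

    @property
    def is_autoconfirmed(self) -> bool:
        # The social norm "established members get more leeway"
        # becomes a hard-coded threshold.
        return self.edit_count >= 10 and self.account_age_days >= 4


def can_edit(user: User, page_protected: bool) -> bool:
    """Access control as institution: this check runs on every edit,
    enforcing the rule without discussion or discretion."""
    return user.is_autoconfirmed if page_protected else True


newcomer = User("newcomer", edit_count=2, account_age_days=1)
print(can_edit(newcomer, page_protected=True))   # False: the code forbids it
print(can_edit(newcomer, page_protected=False))  # True
```

In such a design, the regulation is not argued over case by case; it is executed, which is precisely the institutional quality of code the paragraph above describes.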
Studying the institutional conditions of Wikipedia, Butler, Joyce, and Pike (2008) distinguished a set of perspectives among editors. One way of defining institutions was as rational efforts to achieve consistent and reliable decisions and to codify role positions and duties. From a different perspective, they were taken to represent evolving, competing entities that propagate themselves: rules generate more rules. Another view accentuated the construction of meaning and identity that defines the character and ambition of the project. Institutions were furthermore framed as external signals indicating to audiences or users not actively participating that the project attends to problems and finds ways to address them. They were also regarded as internal signals that raise awareness of topics or perspectives and draw the boundaries between a project’s inside and outside. Wikipedia’s set of rules, norms, and basic understandings was specified in terms of negotiated settlements and trophies that mark the end of conflict or signal binding consensus. Finally, institutions also served as control mechanisms set in place to ensure appropriate action.
Overall, institutions in peer production help to order the dispersed engagement of volunteers and to bring forth meaningful, valuable outcomes. They should facilitate, not suppress, productive engagement. In open projects with unsolicited membership and a constant exit option, we often find a plastic interpretation of institutions rather than strict enforcement, though there are also institutionalized forms of sanctioning and ostracizing users. The flexible handling of rules and social norms reflects the interests and agendas of the users involved. Hence, institutions in peer production are sites of conflict and specification and thus form part of editorial power plays (Kriplean et al., 2007). They are, moreover, a matter of socialization and instruction (Viégas et al., 2007). In this regard, Gabriella Coleman (2013) referred to an “ethical enculturation” (p. 124) she encountered among the contributors to the free and open source software project Debian. In order to stay relevant and mirror the requirements and concerns of the users, institutions have to be actively practiced and passed on to incoming participants.
3 What Rules and Norms? Policies, Guidelines, and Basic Understandings in Peer Production
Institutions in peer production can be classified into those oriented towards products and those centering on work processes. Thus, they encompass, on the one hand, content standards about the form and quality of the generated and delivered goods and services. On the other, they include interactional and procedural standards that arrange the cooperation among project members. Reflecting the institutional register of regulations, norms, and basic understandings, peer production projects have formulated cognate distinctions with different degrees of authority, from axiomatic principles and enforced rules to advice or cues, which respectively prescribe, explain, or suggest correct forms of conduct and valid contributions to the project. In sum, they define a more or less strict scope of activity, encompassing, in decreasing order of exigency and validity, actionable rules, moral tenets, and non‐binding musings.
Exemplary of a number of other peer production efforts, the English‐language Wikipedia features three levels of policies, guidelines, and essays. The online encyclopedia rests on a core set of obligatory “five pillars” that originate from the beginnings of the project in 2001. One precept determines “What Wikipedia is not,” and thus the content scope of the articles. A second specifies the so‐called “neutral point of view.” It requires authors to represent all significant views fairly, proportionately, and without bias. The third principle, “Wikipedia is free,” states the copyright status of the project, which allows anyone to edit, use, modify, and distribute its content. “Civility,” the fourth axiom, reminds contributors to respect each other. This includes the policy to “Assume Good Faith,” which requires editors to treat and think of others well. This catalyst for cooperation works, Reagle (2010) explained, thanks to the “dovetailing of an open perspective on knowledge claims (epistemic) and other contributors (intersubjective)” (p. 161). To this end, authors are framed as cooperative, goodwill contributors striving towards productive joint work. In the English Wikipedia, the set of fundamental ideas is completed by the call: “Be bold.” In this way, the authors hope to account for the evolving character of their trade. Participants should first of all aim to improve Wikipedia, which may also mean scrutinizing, adapting, or suspending existing policies and guidelines.
Although these five rules are marketed as central and unchangeable, they nevertheless vary to some extent from one language version to another. For instance, in the German edition, the fifth maxim is missing. Further core institutions like the wikiquette, which has not yet congealed into a Wikipedia code of conduct as in Debian, attend to the conduct among users and thus to desirable manners and forms of social interaction. They require Wikipedians to be nice to each other, to be honest, and to abstain from legal threats. Other policies, guidelines, and essays deal with, for example, the resolution of disputes, the organization of editing, and the handling of vandalism. They rest on regulatory practices that date back to the era of early Usenet applications and mailing lists (Baym, 1996; Sternberg, 2012).
In sum, they form the “Wikipedia policy environment” (Morgan & Zachry, 2010, p. 165), which has evolved in response to the editors’ need to handle emerging contingencies. The majority of policies tended to focus on process and legal issues, whereas guidelines often dealt with content matters, and essays were mostly dedicated to user behavior. As in other peer production ventures, Wikipedia’s “genre ecology” (Morgan & Zachry, 2010, p. 165) of policies, guidelines, and essays keeps mushrooming (Butler, Joyce, & Pike, 2008; Halfaker et al., 2013). In January 2008, Morgan and Zachry (2010) sampled 47 policies, 232 guidelines, and 404 essays. In his survey of March 2009, Reagle (2010a) found 686 pages in categories relevant to organizing collaboration, 104 of which were proper rule pages. Five years later, Jemielniak (2016) collected more than 1,200 regulatory documents in the English‐language edition and counted 150,000 words in the 50 main policies.
The three groups of Wikipedia institutions provide no clear‐cut and exclusive order but rather form a “helter‐skelter hodgepodge” (Kostakis, 2010) with different levels of validation, integrity, irreversibility, and legitimacy. The extensive and somewhat haphazard corpus of provisions, conventions, and personal advice or opinions seeks to stipulate aspects of working on and in Wikipedia (Tkacz, 2016). Some provide concrete descriptions of procedures and rationales for making decisions, while others consist of formulaic mottoes without tangible directives. They are complemented by a repository of software utilities created for aligning content and patrolling contributions according to the standards in place. A vast number of robots, or bots, are maintained to handle menial tasks such as fixing broken links or correcting typos. In Wikipedia’s “sophisticated technomanagerial system” (Niederer & van Dijck, 2010, p. 1373), these tools, in combination with semi‐autonomous editing interfaces, increasingly support the enforcement of rules, for instance by facilitating the detection and reversion of vandalism (Geiger & Ribes, 2010). In this way, technological assistants take on a growing share of the contributions and also sustain the implementation of institutions. In July 2017, bots accounted for about 20 per cent of all edits (Geiger, 2017). Referring to the technical protocols that steer the activity of bots, Müller‐Birn, Dobusch, and Herbsleb (2013) spoke of an “algorithmic governance” (p. 80) sustaining
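By way of illustration, the following is a minimal, hypothetical sketch of the kind of decision rule a patrolling bot of the sort discussed above might apply; the heuristics, thresholds, and blacklist are assumptions made for the example and do not reproduce the logic of any actual Wikipedia bot.

```python
# Hypothetical sketch of a patrolling bot's decision rule: a content
# standard ("no spam, no page blanking") encoded as an automatic,
# mechanically enforced check. Heuristics and thresholds are
# illustrative assumptions only.

import re

SPAM_PATTERN = re.compile(r"\b(buy now|click here|cheap pills)\b", re.IGNORECASE)


def should_revert(old_text: str, new_text: str, editor_is_anonymous: bool) -> bool:
    """Flag an edit for automatic reversion."""
    if SPAM_PATTERN.search(new_text):
        return True  # spam phrases: revert outright
    removed = len(old_text) - len(new_text)
    if editor_is_anonymous and removed > 0.9 * len(old_text):
        return True  # near-total blanking by an anonymous editor
    return False


# Usage: the bot embodies, and mechanically enforces, a content standard.
print(should_revert("A long article body...", "click here!!!", True))   # True
print(should_revert("A long article body...",
                    "A long article body, slightly expanded.", False))  # False
```

Even a toy rule like this shows why such protocols amount to governance: whoever writes the heuristic decides, in advance and at scale, which contributions count as legitimate.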