Platforms and Cultural Production. Thomas Poell

Our ambition with this book is to advance the theoretical framework we introduced in our initial New Media & Society article published in 2018. This framework is developed further in the first half of the book and is now recast as an institutional perspective on platformization. At the same time, we are committed to doing justice to the wide variety of emerging cultural practices that can be observed across platforms and regions of the world. These emerging practices are just as much part and parcel of the processes of platformization as are institutional changes in markets, infrastructures, and governance. From the perspective developed in the second half of the book, platformization involves vital shifts in practices of labor, creativity, and democracy in the cultural industries. Overall, the book aims to provide researchers and students working at the intersection of platforms and the cultural industries with a comprehensive framework to systematically examine and compare the particular industry segments and practices that they are studying.

      Thanks as well go to the tutorial students, whose feedback helped us to sharpen the focus of the book at an early stage of its conception: Lukas Beckenbauer, Jueling Hu, Daphne Idiz, Vanessa Richter, and Ziwen Tang. The students of the research master’s program in Media Studies at the University of Amsterdam, meanwhile, provided rich feedback on the first draft of the manuscript in the course Research Practices in Media Studies (2020–1). Additionally, we are grateful to Mary Savigar, Sarah Dancy, Ellen MacDonald-Kramer, and Stephanie Homer from Polity, who patiently and steadily guided and supported us through the writing and production process. We would also like to acknowledge the generosity of a number of institutions that sponsored this project: the Social Sciences and Humanities Research Council of Canada (SSHRC), the Amsterdam Centre for Globalisation Studies (ACGS), the Cornell Center for Social Sciences (CCSS), the McLuhan Centre for Culture and Technology, Queensland University of Technology, the University of Amsterdam, the University of Toronto, and Cornell University. Collectively, this support made for an all the more generative collaborative process by allowing us to work together in person – at least, until most of the world became gripped by COVID-19. Finally, we would like to close by expressing our appreciation to the family members, friends, and colleagues who provided support, encouragement, and patience, especially as the trials of researching and writing a book intensified under the weight of a global pandemic. Heartfelt thanks, in particular, go to Emma, Jonathan, and Raphael Poell, Robert Shea Terrell, and Leslie Pilszak.

       Amsterdam, Toronto, Ithaca, 2021

      “Big brands fund terror,” read the front page of the British daily newspaper The Times on February 9, 2017; below the arresting headline was a screengrab of an online ad that – unbeknownst to the client – appeared in a YouTube video openly endorsing jihadists (Mostrous, 2017). According to The Times’ investigation, YouTube’s automated system of placing ads had paired promotions for consumer products and charitable organizations with videos championing radical and terrorist groups, including the Islamic State and Combat 18, a pro-Nazi faction. Several weeks later, the Guardian followed up with a report on the six-figure sums that “hate preachers” had generated from YouTube’s unwitting arsenal of ad sponsors – among them household brands like L’Oréal, Sainsbury’s, Nissan, and even the Guardian itself (Neate, 2017). Indeed, the report chronicled a kaleidoscopic range of extremist content funded through the platform: anti-Western propaganda from a Salafi Muslim preacher, videos by former Ku Klux Klan imperial wizard David Duke, and anti-LGBTQ and anti-Semitic sentiments expressed by a fundamentalist pastor.

      Asked to respond to the high-profile social media scandal, Ronan Harris, a representative for YouTube’s parent company Google, offered: “We believe strongly in the freedom of speech and expression on the web – even when that means we don’t agree with the views expressed” (Neate, 2017). While Harris went on to clarify that Google’s policies prohibit “videos with hate speech, gory or offensive content” from appearing adjacent to ads, he conceded that “we don’t always get it right.” Dissatisfied with Google’s rhetorical deflection, the Guardian – along with the BBC and the UK government – subsequently pulled all advertising from the video-sharing platform.

      While YouTube’s subsequent changes to its advertising policies appeased advertisers – at least temporarily – they introduced considerable angst and uncertainty into the professional lives of cultural producers, in particular those creators vying with one another to earn income from the oft-elusive YouTube Partner Program. Many creators abruptly found their content “demonetized,” meaning they would receive limited or no ad revenue in exchange for audience attention (Caplan & Gillespie, 2020). Creators who provided mere commentary on “sensitive” social issues were especially susceptible to financial retribution. The same applied to creators whose content contained “strong profanity used multiple times … even if bleeped or for comedy, documentary, news, or educational purposes” (YouTube, 2020b).

      In addition to demonetizing content deemed contentious, YouTube substantially raised the threshold for participation in the Partner Program: only channels with at least 1,000 subscribers that had ratcheted up more than 4,000 public watch hours in the preceding year were allowed to participate (YouTube, 2020c). This policy update made it especially difficult for newcomers to generate income, while barring creators with smaller followings altogether. The exclusionary nature of YouTube’s advertising program was exacerbated by a new rule which stated that demonetized clips were only eligible to be reevaluated by a human reviewer if they had a minimum of 1,000 views within a week (YouTube, 2020c). For context, given the mind-blowing amount of material on YouTube – 500 hours of video are uploaded every minute – content categorization and labeling take place through automated, rather than human, systems of content moderation (Covington et al., 2016; see also, Kumar, 2019; Roberts, 2019).
