The consultants were back at the Guardian, this time looking at each individual day of the week to work out its contribution to the bottom line. The greatest immediate worry was still the anticipated flight of classified advertising from print. Companies were beginning to recruit through their own corporate sites. There were numerous new competitors on the block. Finally, there was a shift to CV searching rather than advertising.
The good news was that we were no longer the minnow in UK terms. For its entire history the Guardian had languished at or near the bottom of every league table of readership. In February 2002 the Nielsen NetRatings survey placed the Guardian at number one. It measured us at 934,000 users against 286,000 for the Telegraph, and 249,000 for both the Times and the Sun.
That was cheering, but had to be seen in a wider context. We ranked only 50th in the same survey’s list of the UK’s 100 biggest domains. We were the only newspaper in the ranking, but we were well outside the premier league of big sites and advertising networks. PricewaterhouseCoopers reckoned 80 per cent of all digital spending was going to the top ten sites. We were still nowhere near.
The areas that had previously been profitable for newspapers – such as IT, motoring and finance – were now in competition with specialist portals on niche websites. In the old world, advertisers – like journalists – had only a rough sense of who was reading what. In the new world, every advertisement was measured on the effectiveness of its response. ‘We have a problem,’ wrote the forlorn commercial director, ‘that people come to our sites to read stories – not click on ads. Our response rates are very, very low – particularly on our news site, where we have most inventory. This means we often find ourselves falling off schedules after advertisers evaluate their performance on our sites.’
It was a prescient note. But we were not alone at the time in not divining what Plan B was.
Still, none of the British competitors were serious about charging – not least because there was so much good, free, English-language content available, including the public broadcasters, the BBC and ITV News, and numerous American portals (MSN, Yahoo, AOL, etc.) and news sites. The Irish Times had recently tried a paywall, found itself with an audience of just 6,000, and promptly dropped it. Peter Chernin, the President of News Corporation, had recently admitted that they could see ‘no viable business model that works’ for the internet. The UK newspaper websites were, he said, nothing more than adjunct, promotional vehicles for the newspapers.
We had further debates about long-term reach versus short-term revenue. The Board didn’t think too long about which route to take. The revenue route would probably have left the Guardian with a readership not much bigger than the New Statesman magazine.2 It was impossible to imagine breaking further into the American market with a paywall, or to hope that younger readers would contribute. The advertising revenues were speculative, at best. So a revenue strategy looked like a route to a niche, UK-only publication for older readers.3 The fact that, at this stage, virtually no other general news publisher in the UK or US was prepared to go for revenue over reach suggested that their boards were all looking at a similar dilemma and coming to the same conclusion.
In time future commentators would describe this period as the moment of ‘original sin’. The phrase was coined by Alan Mutter, an expert in media economics and management, who blogged in February 2009: ‘the Original Sin among most (but not all) publishers was permitting their content be consumed for free on the web’.4 He made comparisons with the ‘far more proactive’ music industry, though that was hardly a happy example, with ten songs downloaded illegally for every song purchased lawfully.
‘By the end of the decade,’ wrote the Spanish academic Ángel Arrese, ‘the general consensus in the world of the press was that free online news content would attract massive readerships, whose accumulated attention could be sold efficiently to advertisers.’ He went on to describe a period (2001–7) of what he called ‘the frenzy of failed trials’ following the bursting of the so-called dot-com bubble.5
Newspaper executives, seeing the bubble burst and the old model teetering, returned to insisting on payment – including subscriptions, e-paper, PDF editions, premium content and micropayments. Some (including the Guardian) tried charging for particular features (in our case, the crossword). Others ‘closed’ their websites completely: El País in Spain was an early pioneer and managed to attract 45,000 paying subscribers. It dropped the barrier within three years: there was no business that worked on such low numbers. The NYT abandoned a similar attempt at charging for access to columnists and the archive.
‘By the end of 2007,’ wrote Arrese, ‘a general consensus existed that free content would remain the standard way of commercial exploitation of digital news.’
*
In October 2002 one or two colleagues stumbled across a new service called Google News. One self-mockingly responded by slumping across his desk in a pose that suggested there was little point in carrying on. How could one compete with this all-seeing eye on the world, hoovering up anything that happens, anywhere on the planet and alerting users within minutes in a remorseless perpetuum mobile of breaking information?
The most sinister thing from the point of view of the average desk editor sitting in Farringdon Road or Wapping or Canary Wharf was the little line at the bottom of the site: ‘The selection and placement of stories on this page were determined automatically by a computer program.’
That’s right: it was produced entirely by machines. Not one human being was involved in the generation of this modern news version of the Domesday Book. It was all produced by a cyber spider which whisked its way around the world, filtering news sites and linking to them. Quite how it was done was a mystery as closely guarded as the recipe for Marmite or Coca-Cola.
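The principle, at least, was easy enough to grasp, even if the ranking recipe was the guarded part: fetch the feeds, extract headlines and timestamps, sort by recency, publish. A minimal sketch in Python – using only the standard library, with illustrative feed URLs, and claiming nothing about how Google News actually worked – might look like this:

    # A toy aggregator: fetch a few RSS feeds, pull out headlines,
    # and print the most recent items first. Feed URLs are illustrative.
    import urllib.request
    import xml.etree.ElementTree as ET
    from datetime import timezone
    from email.utils import parsedate_to_datetime

    FEEDS = [
        "https://www.theguardian.com/uk/rss",     # any RSS 2.0 feed will do
        "https://feeds.bbci.co.uk/news/rss.xml",
    ]

    def fetch_items(url):
        """Download one RSS feed; yield (published, title, link) tuples."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            tree = ET.parse(resp)
        for item in tree.iterfind(".//item"):
            title = (item.findtext("title") or "(untitled)").strip()
            link = item.findtext("link") or ""
            pub = item.findtext("pubDate")
            try:
                published = parsedate_to_datetime(pub) if pub else None
            except (TypeError, ValueError):
                published = None
            # Normalise naive timestamps so mixed feeds stay comparable.
            if published is not None and published.tzinfo is None:
                published = published.replace(tzinfo=timezone.utc)
            yield published, title, link

    def main():
        items = []
        for url in FEEDS:
            try:
                items.extend(fetch_items(url))
            except Exception as err:   # a real crawler would retry and log
                print(f"skipping {url}: {err}")
        # Newest first; undated items sink to the bottom of the page.
        dated = sorted((i for i in items if i[0] is not None),
                       key=lambda t: t[0], reverse=True)
        undated = [i for i in items if i[0] is None]
        for published, title, link in (dated + undated)[:20]:
            stamp = published.strftime("%Y-%m-%d %H:%M") if published else "undated"
            print(f"{stamp}  {title}\n          {link}")

    if __name__ == "__main__":
        main()

The hard problems – clustering near-duplicate stories from thousands of sources, ranking them, doing it continuously at planetary scale – are exactly what such a sketch leaves out, and exactly what made the real thing so unnerving to watch.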
I discussed it with a colleague working on the Washington Post website. He shrugged: ‘None of us can predict anything more than six months out. This is TV in 1948.’
We began to debate what it meant to run both print and digital alongside each other in a way that acknowledged the qualities of each. If the net was so much better at handling large amounts of text, why try to compete? What was the point of a verbatim parliamentary report if the whole thing was online the following morning in Hansard? Why not just link to it? Why publish long official reports, speeches or inquiries? We can digest them, explain them, contextualise them and analyse them in print. But aren’t there better uses of space than to print them?
Similarly, with the online version, why simply replicate the paper, as if we were putting radio on television? Wasn’t our strategy of building deeper, richer sites around specific subjects the right one? You could cater for fragments of the paper readership who wanted more of the feature, issue or individual they most treasured. There were people who wanted more football or news from Brussels than we could ever serve in a newspaper. There were writers with individual followings that, we suspected, could dwarf whole departments. All the talk was of the Wall Street Journal technology blogger, Walt Mossberg, and how the Journal had upped his salary to nearly $1 million a year to stop him taking his blog elsewhere. Another high-profile blogger, Andrew Sullivan, set himself up as a blogging brand – launching the Daily Dish in 2000. It was said to be more profitable than Amazon.6
But this idea of seamlessly linking the paper and the website to exploit the best of both did rather depend on newspaper readers having access to broadband computers and vice versa. Every time we told paying newspaper customers that they could get more detailed,