The Bleeding Edge, by Bob Hughes
The range of a modern economy’s inequality is astonishing, and all of it is packed into its most popular products. As technology advances, so does the range of inequality drawn into its web. Must it be so? Today’s iconic electronic products, like yesterday’s cotton ones, embody the greatest range of human inequality currently possible. Most of us know at least some of the facts: the toxic waste mountains; the wholesale pollution of the environments where the copper, gold, tin and rare earths are extracted, in countries where life itself has never been so cheap; the sweated labor; and so on.
TWO PARADOXES ABOUT NEW TECHNOLOGY
Yet here’s the first of two key paradoxes: when you look at what actually happens when technological progress is made, you find very little to support the idea that progress demands inequality – and even some mainstream economists recognize this. World Bank economist Branko Milanovic, for example, concluded a large-scale study of inequality and economic growth in history like this:
The frequent claim that inequality promotes accumulation and growth does not get much support from history. On the contrary, great economic inequality has always been correlated with extreme concentration of political power, and that power has always been used to widen the income gaps through rent-seeking and rent-keeping, forces that demonstrably retard economic growth.3
This is especially and manifestly true when looking at the present system’s ‘jewel in the crown’: the computer. The thing we know (or think we know) as ‘the computer’ emerged in conspicuously egalitarian settings, and it wouldn’t go on functioning for very long if inequality ever succeeded in its quest to invade every nook and cranny of the industries that support it.
The computer in your hand may have arrived there via a shocking toboggan-ride down all the social gradients known to humanity, but inequality was conspicuously absent at its birth, largely absent during its development, and remains alien to computer culture – so alien that the modern economy has had to create large and expensive ‘egalitarian reservations’ where the essential work of keeping the show on the road can be done in a reasonably harmonious and effective manner. The New Yorker’s George Packer has described4 how the leading capitalist companies (Google, Microsoft, Apple and the like) have even built their own, luxurious, egalitarian ‘villages’ and ‘campuses’ where their programmers and other creative types are almost totally insulated from the extreme inequality around them, and can believe they have moved beyond capitalism into a new egalitarian age.
More than half of the world’s computers and smartphones, more and more of its electronic appliances, and nearly all of the internet depend on software created by freely associating individuals, in conscious defiance of the management hierarchies and the profit-driven intellectual property (IP) regime that underpin giants like Apple. Richard Stallman, founder of the ‘Free Software’ movement, sees any attempt to take ownership of the process as an affront to humanity. Of intellectual property law, Stallman has said:
I consider that immoral… and I’m working to put an end to that way of life, because it’s a way of life nobody should be part of.5
Free-market hawks may sneer at such idealism but their world would simply not exist without people like Stallman. Even if it did, it would not work very well without Stallman’s brainchild, the computer operating system known as GNU/Linux, and the global network of unpaid collaborators who have developed and continue to develop it. Google itself is built on GNU/Linux, as are Facebook and other social-media sites, and even the computers of the New York Stock Exchange: GNU/Linux is faster and more robust than the commercial alternatives.
Stallman launched GNU in 1983 as an alternative to the older, established Unix operating system,6 with the difference that all of its code would be freely available to anyone who wanted it, and could be changed and improved by anyone capable of doing so (hence ‘free and open source’, or FOSS). Stallman’s only stipulation was that nobody could own the code, and that any modifications must be shared on the same terms. GNU became ‘GNU/Linux’ after 1991, when a young admirer of Stallman’s work, the Finnish student Linus Torvalds, began circulating the code for the ‘kernel’ that allows GNU to run on different kinds of computers.7 This made it possible to use GNU/Linux (now generally known simply as Linux) on just about every kind of computing device that exists, including automobile engines, avionics, industrial appliances, power stations, traffic systems and household appliances.
The second paradox is that, while the new technologies are in principle supremely parsimonious, their environmental impact has turned out to be the exact opposite. Each wave of innovation needs fewer material inputs than its predecessor to do the same amount of work – yet in practice it consumes more resources. The Victorian economist William Stanley Jevons (see Chapter 5) was the first to draw attention to this paradox, which now bears his name – and it becomes even more striking when considering all the industries and activities that depend on or are mediated by electronics and computers. As economic activity has been computerized, it has become more centralized, and its overall environmental impact has increased – as have capital’s control over labor and over people, the differences in wealth between rich and poor, and the physical distances between them.
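The arithmetic behind the Jevons paradox is easy to sketch. The short Python example below is a purely illustrative model, not anything from Jevons or from this book: it assumes, hypothetically, that demand for a service responds to its effective price with a constant elasticity greater than 1, so that each doubling of efficiency cuts the price per unit of useful work yet raises total resource consumption.

```python
# A minimal numeric sketch of the Jevons paradox (the 'rebound effect').
# All figures are illustrative assumptions, not data from the book:
# demand for a service is assumed to follow a constant-elasticity curve,
# with elasticity greater than 1, so efficiency gains make the service
# cheaper and total resource use ends up rising rather than falling.

def resource_use(efficiency: float, base_demand: float = 100.0,
                 elasticity: float = 1.5) -> float:
    """Total resource consumption at a given efficiency level.

    efficiency  -- units of useful work delivered per unit of resource
    base_demand -- demand for the service at efficiency = 1 (assumed)
    elasticity  -- price elasticity of demand (assumed, > 1)
    """
    # The effective price of one unit of service falls as efficiency rises.
    price = 1.0 / efficiency
    # Demand grows as the service gets cheaper (constant-elasticity form).
    demand = base_demand * price ** (-elasticity)
    # Resources consumed = service delivered / efficiency.
    return demand / efficiency

if __name__ == "__main__":
    for eff in (1.0, 2.0, 4.0):
        print(f"efficiency x{eff}: resource use = {resource_use(eff):7.1f}")
    # Prints roughly 100.0, 141.4, 200.0: each doubling of efficiency
    # *increases* total resource use, because demand grows faster
    # than the per-unit saving.
```

With the assumed elasticity of 1.5, doubling efficiency raises total resource use by roughly 40 per cent; with an elasticity below 1 the paradox disappears, which is why the size of the rebound effect is an empirical question rather than a law.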
Is this mounting impact an inevitable ‘price of progress’, or is it the result of progress falling into the hands of people and a system that simply cannot deal with it responsibly?
WHAT IS TECHNOLOGY ANYWAY?
It is important to challenge two conventional assumptions that are often made about technology: first, that we have capitalism to thank for it; and second, that it follows a predetermined course, that the future is waiting to be revealed by clever minds and that progress ‘unfolds’ from Stephenson’s Rocket to the automobile, DVDs and the iPhone.
The economist Brian Arthur, who has made a lifetime study of technological change, argues that human technology is a true evolutionary phenomenon in the sense that, like life, it exploits an ever-widening range of natural phenomena with ever-increasing efficiency: hydraulic, mechanical, electrical phenomena and so on. He defines technology as:
a phenomenon captured and put to use. Or more usually, a set of phenomena captured and put to use… A technology is a programming of phenomena to our purposes.8
Technology develops through greater and greater understanding of the phenomena, and what they can be made to do, and how they can be coaxed into working together. Arthur uses the analogy of mining: easily accessed phenomena are exploited first (friction, levers) then ‘deeper’, less accessible ones (like chemical and electrical phenomena). As understanding of the phenomena deepens, their essential features are identified for more precise exploitation: the process is refined so that more can be done with less.
As the ‘mining of nature’ proceeds, what once seemed unrelated ventures unexpectedly break through into each other’s domains, and link up (as when magnetism and electricity were discovered to be aspects of the same phenomenon in the early 19th century). No technology is primitive; all of it requires bodies of theory, skill and experience; it tends inexorably to greater and greater economy of material means. He describes how phenomena – for example, friction being used to make fire – are exploited with increasing efficiency as they are worked with, played with and understood.
The parallels with biology are striking. Technology is just like a biological process – and there is a tendency at this point (which Arthur goes along with somewhat) to start thinking of technology as ‘a new thing under the sun’ with a life