9

Technoptimism hits the buffers

Far from being ‘weightless’, the computer economy is built upon hardware that has a huge impact on the earth. As components shrink in size, making them demands more and harder-to-obtain materials, and more energy; the process creates mountains of toxic waste, and the components themselves become waste after just a few years. The system is pushing against a fundamental law of physics, entropy, but this law offers a new, reliable measure of value, which we badly need.

The late 1990s saw a plethora of books proclaiming the arrival of a ‘weightless economy’. The promotional copy for one of the first of them, Diane Coyle’s The Weightless World (1998), described it as ‘a world where bytes are the only currency and where the goods that shape our lives – global financial transactions, computer code, and cyberspace commerce – literally have no weight’.1 Other opinion-shaping titles appearing at around the same time included Understanding the Virtual Organization,2 and The Death of Distance: How the Communications Revolution will Change our Lives.3 The general idea was that tendencies such as ‘telecommuting’ and ‘teleconferencing’ would greatly reduce humanity’s physical impact.

But in 2002 a study by Eric Williams and colleagues at the United Nations University found that the weightless world was much heavier than expected.4 Manufacturing a memory chip weighing two grams required 1.7 kilograms of materials and fossil fuels. In a book published the following year, Williams and a colleague, Ruediger Kuehr, calculated that it took about 1.8 tons (1.63 metric tonnes) of materials to make a single desktop computer.5 These extraordinary figures arise mainly from the extreme levels to which materials have to be refined, in order to produce reliable devices at such tiny scales as the transistors that make up computer chips.
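To restate those ratios in plain numbers – my back-of-envelope gloss on Williams’ figures, assuming (illustratively) that a desktop computer weighs about 10 kilograms:

```latex
\frac{1.7\ \text{kg of inputs}}{0.002\ \text{kg of chip}} = 850:1
\qquad
\frac{1{,}630\ \text{kg of inputs}}{\sim 10\ \text{kg of computer}} \approx 160:1
```

In other words, the chip embodies hundreds of times its own weight in materials and fuels before it ever reaches a circuit board.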

Suddenly, we see that the ‘weightless economy’ in fact depends on vast amounts of energy, and billions of tonnes of water and other materials, which have to end up either in the local environment or the atmosphere. The problem of toxic waste becomes more intense as the chips get smaller. Other researchers, including Jan Mazurek, Ted Smith and Elizabeth Grossman,6 have been writing books and reports since the late 1990s showing how the problem drove chip manufacture out of its original (heavily polluted) homelands – California’s ‘Silicon Valley’ and upstate New York – and into places where disposal was easier: first, the poorer and more sparsely populated states of the southwestern US, and then the Special Economic Zones (SEZs) of Latin America and Asia.

The toxic nature of chip manufacture goes hand in hand with the product’s shrinking size: both are inevitable consequences of Moore’s ‘Law’, which was described in the last chapter and is not so much a real law of physics as a consequence of firms’ efforts to stay in a highly rivalrous semiconductor industry. A respected industry commentator, Jim Turley, has explained that chip makers are under acute pressure to cover the costs of a new ‘fab’ before it becomes obsolete, which they can only do by getting more and more chips out of each silicon ‘wafer’ (the thin slice of hyper-pure silicon from which the chips are made).7 This means smaller and smaller chips, from bigger and bigger wafers (currently 300mm diameter is standard, but soon 450mm will be the name of the game) – which means higher and higher risks that tiny impurities and other random factors could compromise production.

The chip makers do not pursue smallness for the sheer love of small things, or a desire to relieve the end-users of excess weight. The pressure to go smaller and yet smaller is built into the game they’re in. As Turley explained, chips are produced in such high volumes that their price falls rapidly after their first arrival on the market. Every manufacturer wants to cash in on the high initial price before it drops, hence constant pressure for higher densities, bigger wafers, and bigger, better fabs that can deliver them: a self-reinforcing cycle. The ever-higher cost of each new ‘fab’ has to be recouped in the same short time-frame – three to five years – and there is no breaking the cycle other than to get out of the industry (which many chip-design firms have done by ‘going fabless’ – but at the cost of surrendering bargaining power to the specialist ‘chip foundries’, who have deep enough pockets and strong enough nerves to stay in the game).

Chip makers also have to harvest their money quickly because of the need to set up ‘second source’ agreements with other firms. These help guarantee supplies of the new chip if demand proves greater than their own plant can meet, or if its output suddenly drops (through purity problems, for example). Without second-sourcing, their customers, the electronics manufacturers, might not risk adopting the new chip, but because of second-sourcing, the price is doomed to fall: the chip loses its rarity value, becomes ‘generic’ and the price drops massively.

The only way out of this trap is forward: bigger and more expensive fabs, doing more complicated things, at higher and higher risk, with more and more effort going into protection and hedging against risk (‘partnership’ deals; joint ventures; elaborate cross-licensing agreements) and more pressure to find techniques that allow the game to continue. There has never been an industry that required so many new techniques to be invented, and brought into play, so rapidly and with so little constraint on cost – except perhaps the nuclear industry in its early days.

Like the nuclear industry, it has won special regulatory exemptions, so that firms can introduce potentially dangerous new chemicals into their processes as and when they see fit, without having them cleared by health and environmental protection bodies. The Clinton administration’s 1995 ‘Common Sense Initiative’ is a prominent example: introduced at the behest of Intel Corporation, it reduced ‘the regulatory burden’ on the semiconductor industry by allowing chip firms to introduce new chemicals to their processes without prior clearance.8

THE TOXIC DEMANDS OF PURITY

Silicon is one of the cheapest and most easily obtained elements on the planet – but only in its oxide form, quartz. Sand is mostly quartz – and that’s where chips come from. Separating the silicon from its oxygen consumes a fair amount of energy: this is done by heating sand (quartz particles) with carbon. The resulting silicon can be between 99.0 and 99.5 per cent pure. This is good enough for most industrial processes, including some quite demanding ones like medical silicone for implants and heart valves, but it is nowhere near pure enough for microchips. At their tiny scale, 1.0 per cent or even 0.5 per cent of impurities would render the devices too erratic to be of any use. So impurities have to be reduced all the way down to below 0.0001 per cent: by four orders of magnitude.
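The ‘four orders of magnitude’ is straightforward arithmetic – my gloss on the figures above:

```latex
\underbrace{10^{-2}}_{1.0\%\ \text{impurities}}
\;\longrightarrow\;
\underbrace{10^{-6}}_{0.0001\%\ \text{impurities}},
\qquad
\frac{10^{-2}}{10^{-6}} = 10^{4}
```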

The purification is done in a series of potentially dangerous, energy-intensive reactions involving vast volumes of chlorine and hydrogen (both of which must themselves be rendered ‘hyper-pure’ beforehand, using yet more exotic, energy-intensive processes). Then follows the really expensive stage: slicing the silicon into ‘wafers’, and cleaning and polishing them to nano-scale perfection. This involves an ever-changing cast of toxic gases and other chemicals, and vast amounts of water. Every day in 2002, between two and three million gallons of water went into a typical ‘fab’, where it, too, was rendered hyper-pure by yet more energy-intensive processes, before being released back into the environment with its old, harmless-to-humans impurities replaced by toxic new ones: the acids used to etch the transistors into the silicon, masking fluids, and the rest of a continually changing list of chemicals. These often have to be hazardous, because only highly reactive chemicals will do the jobs they are required to do, to the ultra-high tolerances demanded.

This is a pure and spectacular example of positional competition driving design: a game of ‘faster, smaller, more expensive’ that every chip manufacturer is obliged to play.

Smaller, faster microprocessors depend more and more on the so-called ‘rare earth elements’ (REEs). These are highly reactive, metallic elements whose special electrical properties allow the behaviors of glasses and metal alloys to be fine-tuned. Yttrium is used in computer displays, enhances conductivity in metals, and is important for the kinds of high-power light-emitting diode (LED) lights that are now so common. Cerium is essential for the highly polished glass used in the iPad and similar displays. Neodymium multiplies the power of iron magnets – hence, it became essential in the drive for smaller, higher-capacity disk drives, electric motors, speakers and earphones.

None of these elements is, in fact, ‘rare’; they are just extremely difficult to separate from the ores in which they are found. First, this is because they occur at low concentrations: according to research carried out by Kiera Butler for the magazine Mother Jones in 2012,9 the best-quality ores contain only between three and nine per cent of the desired element, so vast amounts of rock must be processed to extract a few kilograms. Second, the very properties that make REEs so desirable render extraction incredibly energy intensive: they form intricate and powerful bonds with other elements, so isolating them requires a succession of extremely energy- and chemical-intensive processes. Butler says that a typical extraction plant consumes a continuous 49 megawatts of power, and two Olympic swimming pools’ worth of water every day – plus large amounts of sulphuric and other acids and solvents. The waste is radioactive as well as chemically toxic. Thorium, which is usually present, has a half-life of about 14 billion years (in other words, its radioactivity may outlast the Earth itself – which is a mere 4.5 billion years old).
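For scale – my own arithmetic on Butler’s power figure, with a deliberately rough household comparison:

```latex
49\ \text{MW} \times 86{,}400\ \text{s/day} \approx 4.2\times10^{12}\ \text{J/day} \approx 1{,}180\ \text{MWh/day}
```

At a US-typical 30 kWh per household per day, that is the continuous consumption of roughly 40,000 homes – for a single refining plant.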

Consequently, these elements cannot be produced in the quantities that industry needs anywhere near the sorts of places where they are eventually used.

Kiera Butler visited the Malaysian town of Bukit Merah, where a company called Asian Rare Earth (partly owned by Mitsubishi) had been disposing of millions of gallons of radioactive waste into the local waterways for 30 years, causing miscarriages, birth defects, leukemia and other cancers; all of the drivers hired by one of her interviewees to dispose of the waste had died young. These effects eventually provoked mass protests, which achieved the plant’s closure – but neither a proper survey of the health effects nor a proper clean-up operation. What is more, in 2008 an Australian company, Lynas, gained approval from the Malaysian government to open a huge new rare-earth refinery in the east-coast town of Kuantan, intended to meet a fifth of global demand. The ores will be shipped thousands of miles from a mine in Australia. Lynas claims this is justified by lower labor, chemical and energy costs but, as Butler pointed out, there were other possible attractions: Malaysia’s environmental-protection laws are much less stringent than Australia’s; its environmental movement, though strong, is nowhere near as well resourced or connected; and its government is much more amenable to corporate deal-making, having granted Lynas a 12-year tax holiday on future profits.

Butler explains that pressure for this kind of deal-making has intensified since China (which has the world’s main deposits of higher-grade rare earths, and produced about 97 per cent of the world’s supply in 2007) cut exports in 2010 by 35 per cent so as to support Chinese manufacturers. This is thought to be one more reason why so much electronics manufacture now takes place in China.

OBSOLESCENCE AND E-WASTE: A TOTAL SYSTEM

Computers and electronics products are classed as ‘consumer durables’, but their durability is more like that of fashion garments than that of washing machines, cookers or even cars – and much less than that of items like grand pianos, which are still largely products of a craft industry and so not commodities in the full capitalist sense. In a 2004 report on working conditions in the computer industry by the Catholic charity Cafod, a Dell executive was even quoted as saying that their products had ‘the shelf life of a lettuce’.10 Like most other reports, it points out that the electronics industry has something else in common with the garment industry: its workforce is overwhelmingly poor, precarious and female, in telling contrast with the lavishly rewarded ‘virtual economy’ where the computer programmers work, which is just as overwhelmingly male.

It is convenient to blame these injustices on consumers’ allegedly insatiable desire for novelty. But obsolescence, which is beyond their control, drives the process, and the obsolescence, in its turn, is driven as hard as it will go by the men who head the computer firms. There is no fame or fortune to be had from making a computer that will serve its owner for as long, say, as his or her piano, guitar or favorite kitchen knife would – although there is no reason why it should not.

A computer may still work perfectly well, and its owner may type no faster than she did four years ago, but it cannot run the latest version of the software she relies on, which she must buy because everybody else is now using it: a pure positional phenomenon. Or she can no longer get spares for it (because ‘nobody’ now has machines that need them). Or her new printer won’t connect to it (and the old one can’t be repaired or upgraded, let alone sold, because there is no second-hand market for it). Or the guy who used to fix the computer has retired and the new guy only knows about the newer machines. Or she’s suddenly aware that everyone else is now communicating via Facebook, which is unusably slow on the old machine. Or the BBC has upgraded its web TV viewer, and she needs the latest version of the Flash browser plugin to watch programs online, and Flash no longer supports machines as old as hers – because nobody uses them any more. And so on: a constantly shifting positional terrain.

Of course, the latest computers are highly desirable things – but their users have other desires (for stability, for example), which go unacknowledged.

As a result of all these positional factors, few people now keep their computers longer than 4.5 years,11 so that millions of functioning machines are dumped, of which only 12.5 per cent are recycled in any sense – and a very broad sense at that, because this often means shipping them in containers to be broken up, often by barefoot children with hammers, or simply dumping them in the deep blue sea. Scandals have been exposed again and again by groups like Greenpeace, the Silicon Valley Toxics Coalition (formed in 1982, when people living in the San Jose area of California discovered that the supposedly ‘clean’ new industries springing up around them were poisoning them and causing cancers), and the Seattle-based Basel Action Network (formed in 1997 to monitor the UN’s Basel Convention on toxic wastes).

The extraction-to-‘toxic trash’ cycle has been brought to public attention by, for example, the Basel Action Network’s 2002 report Exporting Harm, and by author-activists like Elizabeth Grossman, in her 2006 book High Tech Trash.12 The scandals have brought about some regulation, especially where they happened close to home, as with the spate of birth defects and cancers due to toxic pollution around Silicon Valley and the Niagara Falls area of upstate New York in the early 1980s, described by Grossman.13 The general trend, however, is to ‘offshore’ the problem. Horrific illegal e-waste recycling operations are discovered from time to time in Africa, China and the Indian subcontinent – wherever there are people desperate enough to do the dangerous scavenging work that the waste-generating economies cannot afford to do, under brutal work regimes that destroy workers’ health, fertility, babies and often their environments as well, for pitifully low pay, or even no pay at all.

I will not dwell on this side of the ‘weightless economy’, which is described with the care it deserves in a number of places, including Elizabeth Grossman’s book, Jan Mazurek’s Making Microchips, Ted Smith and colleagues’ Challenging the Chip, and Nick Dyer-Witheford and Greig de Peuter’s Games of Empire (which describes the extraordinary exploitation that goes on at every level in the computer games industry).14 Hsiao-Hung Pai’s Scattered Sand is based on interviews, which she travelled thousands of miles to obtain, with workers in some of China’s harshest industrial zones, whose lives prove to be every bit as extraordinary and heroic as those of the corporate leaders whose fortunes they have helped to build.15

DISPLACING THE PROBLEM TO AFRICA

Elizabeth Grossman notes that Europe has taken stronger measures than the US against toxics – but one wonders whether this merely reflects Europe’s lack of a substantial computer industry of its own to match the lobbying power of that in the US. At any rate, in a globalized economy, virtuous national or regional regulations can simply displace, and even intensify, the problem.

The European Union’s Restriction of the use of Certain Hazardous Substances (RoHS) directive of 2002 prohibited the use of some of the more toxic materials, such as lead solder. This was achieved after years of campaigning against very effective rearguard actions from the various industries affected. By the time it went through, manufacturers had already discovered satisfactory alternative tin-based soldering methods – which resulted in a tin rush in the eastern provinces of the Democratic Republic of the Congo (DRC). This was a convenient source, as there was little government there at the time and plenty of cash-hungry soldiers and militias to organize extraction (Hutu militias had made the eastern DRC their base following their expulsion from neighboring Rwanda after the 1994 genocide). UK Channel 4’s reporter Jonathan Miller visited the Bisie mine, the main source of the tin ore cassiterite, in 2005. A trader told him:

The miners work for nothing; the soldiers always steal everything. They even come to shoot people down the mineshafts. Yes, not long ago they shot someone. They force the miners to give them everything and they threaten to shoot anyone who argues. They’re always ready to shoot. We are really penalized. We earn nothing. But we pay a lot. The soldiers – they are all around us here, but they are in civilian clothes.16

Miller wrote: ‘This stuff is mined and portered by people who are like cannon fodder for our industries. Five armies have battled for control of Bisie mine in just five years. But still we buy it.’

This came on the back of an earlier scandal about the ‘coltan rush’ of 2000, which intensified and expanded the Congo’s ‘Great War’ (1998-2004) into hitherto unaffected parts of the eastern DRC. Coltan (columbite-tantalite) is one of the ores of the metal tantalum, whose value had risen with the demand for better and smaller capacitors: the components that store electrical charge in electronic circuitry. Tantalum capacitors are a major reason why modern electronic devices can be as small and as portable as they now are. Development scholar and writer Michael Nest explains in his 2011 book, Coltan, that Congolese coltan is a relatively minor source of the metal, but it is a convenient one: in some places the river mud is entirely coltan, and in 2000 there was no shortage of willing and unscrupulous agents to organize extraction.17 The world’s main sources of tantalum, by contrast, are two huge, open-cast mines – one in Australia, the other in Canada – which are highly mechanized and cannot easily be adapted to meet the wilder fluctuations in demand. Informal sources are important to all industries: they smooth out the peaks of demand without incurring any further cost once the peak has passed.

During 2000, large stockholders and refiners had become concerned that they might not have sufficient tantalum should there be a really huge surge in Christmas demand for second-generation mobile phones and the new gaming devices, and placed additional orders for ore as a precaution. This caused prices to spike, from $30-$40 per pound at the start of the year, to up to $300 in September. In Congo, this gave some farmers (many of whom had been unable to plant or harvest crops because of the war) a chance to make badly needed cash; but then militias and cash-strapped military units operating from the Great Lakes region, and their backers, moved in and fought for control, making a substantial contribution to the Congo War’s estimated five-million death toll.

As Michael Nest explains, nobody gained anything from this mayhem – which was, however, an entirely natural result of large corporations protecting themselves against financial embarrassment, as required by their legal duty to shareholders. As it happened, the stockholders’ worst fears did not materialize; demand for mobile phones and PlayStation2s remained manageable, and by October 2001 the price of tantalum ores had fallen back to its early-2000 level.

WE NEED A NEW WAY TO MEASURE VALUE

The above is just part of a very big picture indeed, of a positionalized, and therefore intrinsically high-impact economic system, which is thoroughly globalized. It is a system shot through with the most radical injustice – inflicted almost in its sleep by nothing nastier than scrupulous cost accountancy and fiduciary diligence, within a framework of national and international law constructed on the premise of strong money and property rights.

Money pretends to offer a reliable measure of the value of things but clearly does not. Is an objective measure of value and harm even possible? The environmental impacts, which happen in tandem with the human ones, can at least be quantified in hard numbers, using the concept of ‘entropy’ (in this context, the degree to which materials become irretrievably mixed together, so that more work is needed to sort them out and make them usable). Timothy Gutowski, one of those developing this method, draws attention to the steady decline in the rate at which materials are recycled, even as they become harder to obtain:

Overall, the trends show an apparent remarkable reduction in the recyclability of products that is due primarily to the greater material mixing. Given the rather significant resources devoted to developing complex material mixtures for products compared with the rather modest resources focused on how to recapture these materials, it appears that there is reason for concern.18
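Gutowski’s ‘degree of mixing’ can be made concrete. The standard thermodynamic relation underlying such accounting – the general textbook formula, not Gutowski’s specific model – is the ideal entropy of mixing:

```latex
% x_i = mole fraction of material i, R = gas constant
\Delta S_{\mathrm{mix}} = -R \sum_i x_i \ln x_i,
\qquad
W_{\min} = T\,\Delta S_{\mathrm{mix}}
```

The more evenly a product’s materials are intermingled, the larger the mixing entropy, and the larger the minimum work – and hence energy – that any conceivable recycling process must expend at temperature T to separate them again.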

Gavin Bridge, mentioned earlier in connection with the ‘Jevons Paradox’, says that:

The variety of elements on electronic circuit boards, for example, has increased from 11 in the 1980s to potentially over 60 in the 2000s as manufacturers have sought to boost product value and performance… Even mature technologies like telephones can use over 40 different materials: contemporary mobile phones, for example, can contain 17 different metals, only 4 of which – gold, silver, palladium, copper – are currently recycled.19

To put the above in context, I was once told by a former employee of the UK’s General Post Office (GPO) that their old, Type 332, rotary-dial telephone – featured in Alfred Hitchcock’s Dial M for Murder and innumerable other British films between the mid-1930s and the 1960s – contained just five different materials, and was 100-per-cent recyclable. It was also not a commodity: it belonged to the GPO and remained its responsibility throughout its life.

Figures published by the World Resources Institute (WRI) have shown a more than fourfold increase since 1960 in the world consumption of four basic metals: nickel, chromium, zinc and copper, with a sudden increase in the usage of copper between 1995 and 2000 (from around two million tonnes per year to around five million tonnes). For most materials, this is despite their having become more difficult and more energy-intensive to extract, because the ‘low-hanging fruit’ – the richest, most accessible deposits – were used up long ago. And, as the extraction rate mounts, more and more of what’s extracted is going straight back into the soil, rivers, seas and atmosphere as waste. WRI reported in 2000 that ‘Between half and two-thirds of material inputs to industrial economies returned to the environment within a year’.20

The amounts of metals produced have increased dramatically, even as major deposits are exhausted and the quality of remaining ores diminishes. This necessarily implies unprecedented levels of pollution and waste, plus greater oppression in the areas where production takes place, and louder justificatory music in the places where the end products are consumed.

The drive to keep costs down mandates tighter controls on labor – labor is one of the very few factors in the production equation from which managers and entrepreneurs can squeeze extra profit when costs rise or prices fall. Gavin Bridge remarks on the ‘stark contrasts… between the hyper-mobility of exported resources and the geographical and social immobility of many people living and working in the same space’.21

ENTROPY: MEASURING WHAT’S POSSIBLE

Conventional economics doesn’t capture all of the economy’s costs and, according to its critics, it isn’t meant to. The Harvard economist Stephen Marglin argued in his 2008 book The Dismal Science that mainstream economics has always been the servant of powerful interests, which naturally want reassurance that what they are doing is for the best, and don’t wish to hear otherwise. Mainstream economics, he writes, ‘has been shaped by an agenda focused on showing that markets are good for people rather than on discovering how markets actually work.’22 So economists still maintain that money (as defined by the banking and finance community) is the best measure of the human value of economic activities – even though we see more and more examples every day of environments and lives ruined by the pursuit of monetary profit.

Value is certainly a tricky thing to measure, but ultimately everything has material cost, whether a monetary price is attached to it or not. Eric Williams used a ‘life-cycle analysis’ approach to work out what microprocessors really cost in terms of materials and energy (and the materials used to produce the energy). This involves tracking down those very things that a capitalist firm must externalize to survive, exposing much larger externalities and knock-on costs than anyone suspected, and producing a bill of costs that corresponds closely with the observed human and environmental impacts.
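The mechanics of such a tally are simple enough to sketch in a few lines of code. The stage names and numbers below are hypothetical placeholders, chosen only to show how the method accumulates upstream and ‘hidden’ costs; they are not Williams’ data:

```python
# A minimal life-cycle energy tally, in the spirit of life-cycle
# analysis. All stage names and figures are hypothetical.
stages_mj = {
    "silicon refining":   5_600,   # includes chlorine/hydrogen purification
    "wafer fabrication": 27_000,   # cleanrooms, etching, ultra-pure water
    "assembly":           5_800,
    "transport":            900,
    "use (3 years)":     10_300,
}

total = sum(stages_mj.values())
for stage, mj in stages_mj.items():
    print(f"{stage:18s} {mj:7,d} MJ  ({mj / total:5.1%})")
print(f"{'TOTAL':18s} {total:7,d} MJ")
```

The point of the exercise is the shape of the result, not the numbers: once manufacturing stages are counted, the energy consumed before a device is ever switched on can dwarf the energy it uses in service.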

For example, the full life-cycle cost of the LED lights referred to by ‘green growth’ enthusiasts like Chris Huhne (mentioned in Chapter 5) is far greater than it seems because so much energy is used in their manufacture. They are semiconductor devices, made in the same high-energy fabrication plants as computer chips. Even so, they could in principle have only about 25 per cent of the lifetime energy cost of old-fashioned incandescent lighting. However, this is without considering why they are manufactured in Special Economic Zones in the Global South rather than in the English home counties or the Berkshire hills of Massachusetts.

LEDs (and microchips) are manufactured in these distant places because it became too costly and too problematic to carry on making semiconductors in more prosperous places. In the SEZs, and in poorer places generally, costs do not have to be accounted for so rigorously, which implies further external costs that can take some tracking down. In addition to the costs that go with semiconductor manufacture, substantial but hard-to-measure costs are certainly incurred in the factories where the LEDs are assembled into saleable products like torches and car tail-lights. Some costs will be almost indecent to quantify in money terms: illness, early death, birth defects and so on. Finally, because the LEDs have been made so cheaply that quite powerful ones can be given away as promotional gifts, they end up being used far more than the previous generation of lighting. So even though an individual LED light may still use less energy over its lifetime than its predecessor, the cumulative effect is an increase in energy used for lighting, which will show up on the ‘environmental bottom line’ no matter how hard firms work to hide the impacts that come in between. A number of reports and conference papers have tackled this complicated cost-accountancy problem.23
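A toy calculation shows the shape of this rebound argument. Only the 25-per-cent lifetime-energy ratio comes from the text above; the usage figures are hypothetical:

```python
# Toy 'rebound effect' calculation for lighting.
# Hypothetical figures, except the ~25% lifetime-energy ratio cited above.
incandescent_energy = 1.00   # energy per lamp-hour, arbitrary units
led_energy          = 0.25   # ~25% of incandescent, full life-cycle basis

hours_before = 1_000         # lamp-hours consumed with incandescents
rebound      = 6             # hypothetical: cheap light invites far more use
hours_after  = hours_before * rebound

print("before:", incandescent_energy * hours_before)  # -> 1000.0
print("after: ", led_energy * hours_after)            # -> 1500.0
# Each LED hour is 4x cheaper, but total energy rises as soon as
# usage grows by more than 4x.
```

The break-even point is simply the inverse of the efficiency gain: with lights four times cheaper to run, any growth in use beyond fourfold turns an efficiency gain into an absolute increase.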

MAXWELL’S DEMON: THE SPOILER IN THE GREEN GROWTH DREAM

An additional way of getting a handle on the problem is to recognize the fundamental laws of physics that are involved.

The key here is the Second Law of Thermodynamics (also known as ‘The Entropy Law’ and ‘The Universal Law’). This surprisingly recent law (1824) states what ought to be obvious: in effect, that you cannot get something for nothing, or have your cake and eat it. The Second Law is like the ‘bad news’ part of an ‘I’ve got good news and I’ve got bad news’ joke: whereas the First Law says energy cannot be destroyed, whatever you do to it, the Second Law says that, while that may be true, the energy may be no earthly use to you after you’ve put it to use – so use it carefully. Perpetual-motion machines cannot exist in our universe.

As the Scottish physicist James Clerk Maxwell put it: ‘The second law of thermodynamics has the same degree of truth as the statement that if you throw a tumblerful of water into the sea, you cannot get the same tumblerful of water out again’.24

The example that the Parisian engineer Sadi Carnot had in mind when he defined the Law was the heat that goes into a steam engine’s boiler: it may still exist after it has passed through the engine’s pistons, wafting around in the atmosphere, and it may linger for a while in the locomotive’s metalwork, but just try doing anything useful with it! If you want to perform the same trick again, you must get more coal. Some of the energy can be retrieved by clever engineering (as James Watt had realized not long before, and as power stations do with their enormous cooling towers), but that takes extra work – which also consumes energy, and dissipates materials into the environment in the same way that the engine’s steam melts into the atmosphere. It has to be done with care and consideration.
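Carnot’s analysis puts a hard number on how much of the boiler’s heat can ever become useful work. The standard statement of his limit – the formula his example leads to, not given in the text above – is:

```latex
\eta_{\max} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}}
```

with temperatures measured in kelvin. A boiler at around 450 K exhausting to air at around 300 K can, at best, turn about a third of its heat into work; the rest is irretrievably dissipated, whatever the engineer does.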

Maxwell proved in the 1860s that gases (including air and steam) are composed of molecules bouncing around at different speeds, and wondered whether there was a way of using this insight to prove the Second Law, and rule out the possibility of perpetual-motion machines once and for all.

In 1871, in a letter to a friend, Maxwell described a small, imaginary being ‘whose faculties are so sharpened that he can follow every molecule in its course’. This imaginary being was enclosed in a box full of air: a collection of randomly moving molecules of nitrogen, oxygen and so on, warmer ones moving quickly, cooler ones moving slowly. The box was divided by a wall in which there was a trap-door. Using the trap-door, Maxwell’s creature would let the fast-moving, hot molecules go into one compartment, and the slower, cold ones into the other. Eventually this would turn the box into something like the piston cylinder in a steam locomotive: hot (fast-moving) gas, creating pressure, on one side; cold, slow-moving gas with lower pressure on the other – the basic requirement for a machine capable of work. Maxwell’s question was: does reality allow this, and if not, why not?

This creature soon became known as ‘Maxwell’s Demon’, and it remained a well-loved riddle of physics until, in 1929, Einstein’s friend Leo Szilard worked out why the demon would not be able to do the job. His solution hinged on the new understanding of information (previously assumed to be something immaterial, outside physics). Szilard realized that the demon needs at least to be able to recognize the different molecules before it can do its work: that is, it needs information, and this can’t be left out of the calculation. And all information, apparently, comes at a price, even the modest information the demon needs. When the sums are done, and even allowing for the demon to be infinitesimally small and entirely without any physical needs of its own, the cost of the information it needs to identify the fast and slow molecules just outbalances the value of any energy it harvests.25
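In modern notation, Szilard’s bookkeeping comes down to a single bound – my summary of the standard textbook result, not his original wording. One bit of information about a molecule lets the demon extract at most k_BT ln 2 of work from the heat bath, while acquiring that bit (or, in later treatments following Landauer, erasing it) costs at least the same amount:

```latex
W_{\text{extracted per bit}} \;\le\; k_B T \ln 2 \;\le\; E_{\text{cost per bit}}
\quad\Longrightarrow\quad
W_{\text{net}} \;\le\; 0
```

The demon can break even at best; it can never come out ahead.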

PUNCTURING THE WEIGHTLESS ECONOMISTS

Maxwell’s Demon is, therefore, a challenge and a warning to capital, and ought to have introduced caution into the idea of a ‘frictionless’, ‘weightless’ information economy. Yet the information economy still operates on the old assumption that information costs nothing. Now we are paying the price, as we will see in Chapter 10.

Even as the ‘weightless economy’ was preparing for take-off, in 1990, the information theorist Tom Stonier was teasing out the implications of Szilard’s proof. In a fascinating but little-read book, Information and the Internal Structure of the Universe, Stonier showed that information has a physical reality – and that structure and information are counterparts of each other.26

When structure (for example, the latticework of an ice crystal) is lost through melting, so is all the information implicit in that structure, which is not just something the human aesthetic sense has projected onto it: it is real, even if it is small. Stonier showed that one joule of energy (a tiny amount) corresponds to 10²³ bits of information: a 1 followed by 23 zeros, or 100 zettabits, an astronomical number. The fact that so much information can be had for so little energy sounds like good news for the weightless economists. But first, that is without the cost of any technology for handling the information, which could be exponentially greater. And second, the converse is also true: even a tiny expenditure of energy necessarily implies the loss of an astronomical amount of information and structure, and it will take exponentially more information (involving yet more work and entropy) to put that structure back together.
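The joule-to-bits conversion can be reconstructed from the same k_BT ln 2 bound – my arithmetic, since Stonier’s assumed temperature is not given in the text. At about 1 K the bound reproduces his figure; at room temperature a joule buys far fewer bits:

```latex
N_{\text{bits}} = \frac{E}{k_B T \ln 2},
\qquad
\frac{1\ \text{J}}{(1.38\times10^{-23}\ \text{J/K})(1\ \text{K})\ln 2} \approx 1\times10^{23},
\qquad
T = 300\ \text{K}: \;\approx 3.5\times10^{20}
```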

This is not what the proponents of weightlessness want to hear, but it ought to give pause to those, like the very rich and influential Matt Ridley, who argue that waste is nothing to worry about because all the valuable materials we have wasted are still there, in the slurry-tanks and spoil heaps and waterways, ready to be harvested by clever new technologies. But all new technologies cost energy, structure and information, no matter how ingenious they seem. Ridley writes (in a piece for the Washington Post, entitled ‘Why most resources don’t run out’):

Take phosphorus, an element vital to agricultural fertility. The richest phosphate mines, such as on the island of Nauru in the South Pacific, are all but exhausted. Does that mean the world is running out? No: There are extensive lower-grade deposits, and if we get desperate, all the phosphorus atoms put into the ground over past centuries still exist, especially in the mud of estuaries. It’s just a matter of concentrating them again.27

I love that ‘just’. And who is ‘we’? It is of course conceivable that there might be some micro-organism somewhere that will do this job for ‘us’ but somebody will have to find it (and even Maxwell’s Demon would need some resources to do that) and the possibility of knock-on effects should not be ruled out. The overwhelming probability is that it would be exponentially more costly than the original practice of digging guano.

Well-structured things are said (by the Second Law) to have low entropy; disrupted things have high entropy, and high-entropy things are generally useless. The tendency of phenomena in the greater universe is towards greater entropy, as the heat and light of the stars dissipate. At some point in the future the universe faces ‘heat death’: the same temperature everywhere, and physical uniformity in all directions. There is no need to expedite this process in one of the possibly very rare parts of the universe where entropy has gone into reverse – but that is what the dynamic of capitalism does, and what its ideologies promote.

Low entropy has to be created by work – which for humans means also creating heat, and therefore overall entropy – so it must be done with care and respect. The extremely low entropy represented by an ingot of pure silicon demands the greatest respect.

The real source of low entropy is life itself. Life, said the Austrian physicist Erwin Schrödinger in 1943, is a system for locally reversing the Entropy Law: for transforming simple chemicals and water into complex and ever more complex, highly structured organisms, and even complex geologies derived from those organisms. He called the process ‘negative entropy’; others shortened it to ‘negentropy’, which is the term people now use.28

Life can do this because the earth is an ‘open system’ – open to assistance from the sun, a tiny fraction of whose incoming energy (less than 0.1 per cent29) is harnessed by plants, to create whole landscapes of order and complexity out of simple minerals and sunlight.
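The ‘less than 0.1 per cent’ checks out against the figures given in note 29:

```latex
\frac{3{,}000\ \text{EJ/yr captured by photosynthesis}}{3{,}850{,}000\ \text{EJ/yr of absorbed solar energy}} \approx 0.08\% < 0.1\%
```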

Evolution, says Schrödinger, is the result of nature’s progressive refinement of its negentropic techniques, creating more and yet more complex and self-sustaining order through ever-more parsimonious use of the same basic resources. Humanity and its technologies ought to be neither more nor less than a continuation of this process – as Brian Arthur’s definition of technology implies. And indeed most human communities for most of our history have, by and large, been negentropic. When humans settle in and develop a new habitat, they can and usually do increase the amount of order that it contains, building complex plant communities for their own uses, in places where little grew before. The results are just as wonderful as when lichen and mosses colonize stark, volcanic landscapes, bringing richness where previously there was only sameness. This was the situation in Ladakh, as described by Helena Norberg-Hodge (Chapter 4); also in millions of back gardens and allotments: oases of diversity and detail, contrasting sharply with everything around them.

All of this breaks down when entrenched inequality and hierarchical societies make their appearance. All the careful prior work of nature and humanity is suddenly fuel for quick self-aggrandizement, and its power to do this increases with the efficiency of the available technology, so that the more parsimonious the technology, the worse the consequences, culminating in the situation we now face.

Timothy Gutowski and his colleagues (2009) write that:

the intensity of materials and energy used per unit of mass of material processed (measured either as specific energy or exergy) has increased by at least six orders of magnitude over the past several decades. The increase of material/energy intensity use has been primarily a consequence of the introduction of new manufacturing processes, rather than changes in traditional technologies. This phenomenon has been driven by the desire for precise small-scale devices and product features and enabled by stable and declining material and energy prices over this period.30

So the alleged super-efficiency of electronic devices is only obtained at a huge and escalating physical cost; and (as we will see shortly) is largely propaganda: the now-dominant type of computer has been described with good reason as ‘the least efficient machine that humans have ever built’.31

1    Diane Coyle, The Weightless World, MIT Press, 1998.

2    Bob Norton & Cathy Smith, Understanding the Virtual Organization, Barron’s Educational, 1998. For a lucid, early critique of this wave of books, see Ursula Huws, ‘Material World’, Socialist Register, 1999, nin.tl/Huws1999

3    Frances Cairncross, The Death of Distance: How the Communications Revolution will Change our Lives, Harvard Business School Press, Boston, 1997.

4    Eric D Williams, Robert U Ayres & Miriam Heller, ‘The 1.7 Kilogram Microchip’, Environmental Science & Technology 36(24), 2002, pp 5504–5510, nin.tl/microchipweight

5    Kuehr & Williams, 2003, cited in D Pellow, D Sonnenfeld & T Smith, Challenging the Chip, Temple University Press, 2006, p 205.

6    J Mazurek, Making Microchips, MIT Press, 1999; Ted Smith, David Allan Sonnenfeld & David N Pellow, Challenging the Chip, Temple University Press, 2006, nin.tl/ChallengingtheChip; E Grossman, High Tech Trash, Island Press, 2006.

7    J Turley, ‘Semiconductors on a Train’, Embedded Systems Design, 2 Aug 2006, nin.tl/semiconductorstrain

8    Mazurek, op cit, p 7.

9    Kiera Butler, ‘Your Smartphone’s Dirty, Radioactive Secret’, Mother Jones, Dec 2012, nin.tl/MJsmartphonesecret, accessed 6 April 2013.

10  KMG Astill, Clean up your computer, Cafod, Jan 2004, nin.tl/cleanupcomputer

11  Antony Leather, ‘Most Computers Replaced after 4.5 Years’, Bit-Tech, 12 May 2011, nin.tl/bit-techreplace, accessed 16 Feb 2014.

12  Grossman, op cit; J Puckett & TC Smith, ‘Exporting harm’, Basel Action Network, 2002.

13  Grossman, op cit, p 79.

14  Grossman, op cit; Mazurek, op cit; Ted Smith et al, op cit; N Dyer-Witheford & GD De Peuter, Games of Empire, University of Minnesota Press, 2009; Astill, op cit.

15  Hsiao-Hung Pai, Scattered Sand, Verso, 2012.

16  J Miller, ‘Congo’s Tin Soldiers’, 30 Jun 2005, nin.tl/congotin

17  Michael Wallace Nest, Coltan, Polity, 2011.

18  Bhavik R Bakshi & Timothy G Gutowski, Thermodynamics and the Destruction of Resources, Cambridge University Press, 2011, p 125.

19  G Bridge, ‘Material Worlds: Natural Resources, Resource Geography and the Material Economy’, Geography Compass 3(3), 2009, pp 1217-1244.

20  Ibid.

21  Ibid.

22  SA Marglin, The Dismal Science, Harvard University Press, 2008, p 3.

23  Navigant Consulting, Life-Cycle Assessment of Energy and Environmental Impacts of LED Lighting Products Part I, US Department of Energy, Feb 2012. See also Timothy G. Gutowski, ‘Manufacturing and the Science of Sustainability’, Massachusetts Institute of Technology, 2011.

24  Harvey S Leff & Andrew F Rex, Maxwell’s Demon: Entropy, Information, Computing, Adam Hilger, 1990, p 39.

25  Leo Szilard, ‘On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings’, 1929. Published as Chapter 3.1 of Leff and Rex, 1990, op cit.

26  T Stonier, Information and the Internal Structure of the Universe, Springer-Verlag, 1990.

27  Matt Ridley, ‘Why Most Resources Don’t Run out’, 30 April 2014, nin.tl/Ridleyresources

28  In his 1943 book What is Life? Schrödinger called it ‘negative entropy’; Leon Brillouin shortened it to ‘negentropy’ in his 1949 essay ‘Life, thermodynamics and cybernetics’, published as Chapter 2.5 of Leff and Rex 1990, op cit.

29  The total solar energy absorbed by Earth’s atmosphere, oceans and land masses is approximately 3,850,000 exajoules (EJ) per year – more energy in one hour than all human activity used in the whole of 2002. Photosynthesis captures approximately 3,000 EJ per year in biomass.

30  TG Gutowski, MS Branham, Jeffrey B Dahmus, AJ Jones, A Thiriez & DP Sekulic, ‘Thermodynamic Analysis of Resources Used in Manufacturing Processes’, Environmental Science and Technology 43, no 5, 2009, pp 1584-1590.

31  G Dyson, Turing’s Cathedral: The Origins of the Digital Universe, Knopf Doubleday, 2012.