CHAPTER 12

DISPLACEMENT

Alan Hamilton turned up on the doorstep of my student share house in Redfern on 26 January 1988. It was the Bicentennial, marking 200 years since the First Fleet arrived and began the process of dispossessing Indigenous Australians of their lands. Prime Minister Bob Hawke led the national celebrations as a replica flotilla re-enacted that first entry into the remarkable harbour. There were inspirational speeches and Sydney’s first really indulgent firework display. Closer to home, my share house had joined an Invasion Day March from Redfern to the city, a recognition that not everyone was joining the Celebration of a Nation, and that what appeared to be a glorious procession into history had come at an immeasurable cost to an entire people.

Al rocked up as we were debriefing a hard day’s activism on the front porch. He had hitchhiked up to Sydney to study philosophy and seen our Rooms to Rent ad in the paper. Ours was by far the cheapest on offer, thirty bucks a week if I remember rightly, but the room was so small it could only fit a single bed and not much else. This didn’t faze Al because he was used to travelling light. Originally from rural New Zealand, he had skipped town at 16 and headed across the ditch, picking up odd jobs and hauling himself into uni. He had the outlook of the habitual itinerant, adapting to our six-bedroom terrace and the surrounding community of misfits as though he had always been a part of it.

When he got through uni, Al sought out a job with the unions but he lacked the network to get a start. Instead he became an employer rep in the manufacturing sector, learning how to nudge workers into embracing the changes that no-one could really avoid. It was the early nineties and the economy was opening up, which meant the end of industry tariffs, which in turn made it harder to protect local businesses from cheaper overseas imports. Whether you were building petrol pumps or motor cars, the conversation was always the same: how can we do things smarter to keep local business in the game? What are you prepared to change and what do you expect in return? The battles got heated at times, but more often than not it was a joint mission – employers, unions, employees – everyone had something invested in making things work.

Al realised he was good at bringing people with him and he started building a career, and a family too. He married and started a brood, settling in Brisbane, his wife’s hometown. In 2000 he stumbled into the global computer giant IBM, which had just won a massive contract with the newly privatised Telstra. Al’s job was to shift Telstra’s entire back office operations out of the company and into IBM, where the same staff would provide the same service. It was a straightforward proposition: shift across and you keep all your conditions, all your entitlements, plus access to the latest computers with one of the global leaders. New employees would be hired on IBM’s conditions, not Telstra’s more generous ones, but that was not the point of the exercise. The real purpose was that Telstra would no longer need to buy computers from IBM and hire workers to operate them. Instead it would pay IBM a monthly fee to provide IT as a service: computers, software, workers, all in the one transaction. No longer a fee for a product, but a fee for a service.

This simple proposition drove outsourcing across Australian industry for the next decade: the lucrative financial services sector, insurance, telecommunications, anywhere that employed a mass of workers to run the business machine. Instead of having offices full of people working on ageing computers, you could have the latest technology and best-trained staff on tap. It was a profound shift in the idea of what a company was. Instead of being an institution that employed lots of people and invested in capital equipment, it would become more of a shell, managing a series of contracts delivered by specialist outsiders. The logic of the transaction was that it reduced a company’s fixed costs and gave it access to the best technology, technology that was not only driving business costs down but also delivering better service.

As the internet became faster and more secure, there wasn’t even the need to have the workers in Australia anymore. You could manage a system from anywhere, and ‘anywhere’ had much lower wage bills. Help desks, data processing, even customer call centres could be housed in developing countries where the smartest young bilingual workers could deliver the service for a fraction of the cost but still feel themselves to be on a great wicket. Sure, there were times when consumers would push back, complaining customer service from a foreign person was inferior to that provided by a local, but you could bring customer-facing service back to a regional call centre in Bega and still manage the bulk of your back office in Bangalore.

Over the next 15 years Al would become a master of outsourcing. He estimates he transitioned more than 40,000 jobs into and out of IBM. And he honestly believed that what he was doing was good for everyone: companies would be able to offer cheaper products, workers in developing nations would get the chance of a life-changing job, and locals who were doing pretty basic process work would move up the occupational food chain, finding the smarter, more creative jobs that the internet would inevitably create.

‘I drank it all up,’ he concedes. ‘The thing is when you are doing this sort of work you are totally immersed in the specifics of the project and you don’t get the space to see the bigger picture.’

#

There is nothing new – nor innately wrong – about jobs being replaced by technology. The quality of life we enjoy today is the product of waves of change that have replaced bone-crushing, soul-destroying manual labour with mechanical processes. Tractors replaced field hands, factories replaced cottage industries, then automated assembly lines transformed factories, all waves of automation that changed the sorts of work most people do. There was always a deal implicit in the change, that there would be something better on offer if you could acquire the skills to do it. From farm labouring to factory work, from factory work to services, from services to higher-level problem-solving, the extra wealth from the smarter use of human resources would create demand for more complex goods and services.

Economic development isn’t just theory. The lived experience has been that more efficient processes create wealth that is shared to create demand for services that are delivered by workers in more complex and better-paid jobs. The benefits don’t just flow seamlessly, though. Initial waves of technological change are always followed by disruption, reform, even revolution, as people demand control over the impact of change. Following the industrial change of the nineteenth century, laws were needed to ensure the new workplaces were safe, that children were protected and, later still, in the twentieth century, that nature wasn’t destroyed. Unions emerged to ensure decent wages and taxes on profits shared the benefits of the growing wealth. But once those ground rules were in place, technological change drove up standards of living, life expectancy and general enjoyment of life.

The same optimistic model applied at the birth of computing. By automating processes for handling and distributing information, it became much easier to complete basic tasks, store information, capture and transmit intellectual property. Typists, phone operators, clerks and administrative assistants all found most of their work automated, driving massive efficiencies in big government departments and corporate back office processes. The cost of doing business was greatly reduced and if workers were no longer needed in one industry, they could find jobs elsewhere. The cheaper cost base also drove down the price of products and allowed governments to shrink the public sector and lower taxes, which in turn left more money in the public’s pockets, creating demand for new services that the displaced workers could deliver in their new jobs.

The web followed a similar trajectory over its first two decades. Instant connectivity opened up a whole world of smart and creative work that did not previously exist. The web provided new and diverse ways to make a living, from those who mastered skills like coding and design, to those with the creativity to develop products that could be traded online. Even those like me, who found new ways to tell stories.

The price of these new opportunities was the destruction of jobs that had managed the distribution of scarce information. The job usurpers of the twenty-first century are a roll call of the Big Tech companies that either challenged workers’ utility or their employers’ business models. Retail workers have been displaced by Amazon and other online retailers, video stores by Netflix, record shops by iTunes and then Spotify. Bank tellers gave way first to ATMs and then online banking, researchers to Google, stockbrokers to online trading platforms, travel agents to TripAdvisor and the wretched journalists to the Facebook algorithm. Where jobs would disappear has not always been predictable, nor inevitable. Postal workers, for example, seemed doomed with the advent of email, but have found a second life with the rise in online shopping that has itself undermined the need for retail workers in real shops.

Through these waves of change the deal has held that money generated from efficiencies would be invested in new industries that would create new jobs. Robert Reich’s ‘symbolic analysts’ would inherit the earth, armed with the skills to identify and solve society’s big problems. Education would be the key to success in this connected world and the centrepiece of the grand bargain between government and the people. Develop the skills and you would have the opportunity for a stable and interesting life. Maybe the jobs wouldn’t be as secure as those of the past, but you could charge a higher hourly rate and, hey, who wants to be in the same job forever anyway?

But something profound started to happen about five years ago. Al noticed that the jobs he was outsourcing were no longer being replaced up the value chain. They were just disappearing. He started talking to friends and colleagues with kids ready to enter the workplace and, regardless of their qualifications, young people were struggling to get a foothold in anything resembling a career. He looked up from his next project and started to see how opportunities had stopped automatically cascading up. Then it started happening inside his own company.

#

In 2014 Al was appointed to an internal IBM team given the job of selling its Contact Centre business, a global operation of 50,000 business process employees that Al had helped create. The staff were not IT people but rather health insurance claim processors, bank loan originators and telco and utilities call centre staff. The business was profitable and effective, but even though the workers were highly skilled, their work followed particular lines of logic that could be replicated by computers. The process was simple: just log every customer interaction and build a massive dataset, then program a computer algorithm to understand the questions and respond with the correct chain of answers.
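
For the technically curious, that logic is simple enough to sketch in a few lines of Python. This is only an illustration with invented questions and answers; real systems train language models on millions of logged calls rather than relying on fuzzy string matching.

```python
import difflib

# The 'massive dataset': logged customer questions paired with the
# answers human agents gave. (These entries are invented.)
logged_interactions = {
    "my internet is down": "Please restart your modem and wait two minutes.",
    "how do i pay my bill": "You can pay online, by phone or at the post office.",
}

def respond(question: str) -> str:
    # Match the new question to the closest logged one and replay
    # the answer a human once gave.
    matches = difflib.get_close_matches(
        question.lower().strip("?! "), logged_interactions, n=1, cutoff=0.6)
    if matches:
        return logged_interactions[matches[0]]
    return "Let me transfer you to an operator."

print(respond("How do I pay my bill?"))  # replays the logged answer
```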

Once customer care becomes automated the service may not be quite as good as a human-to-human one, but it won’t be that much worse, either. It will be so much cheaper, though, that the business case will be irresistible: a computer program that can replace 50,000 workers, never takes leave or sick days, doesn’t need meal breaks – or any break at all for that matter. Ever. How could a human ever compete?

As Al worked through the transition, he saw similar developments emerging across the industry. All the new technology being developed had a simple purpose: to save money by replacing humans. Not changing their roles, but replacing them altogether as an end in itself. At the heart of this shift was a new application that would redefine the web, the creation of automated processes that could replicate human thought.

Artificial intelligence, or AI as it’s become known, is the automation of complex decisions based on the collection and analysis of data. In the past, computers were programmed to undertake defined tasks that could be memorised and repeated. These tasks could be relatively complex as long as they had been predicted by the programmer and factored into a chain of decisions. But the new wave of programming created more complex algorithms that actually allowed computers to observe and learn as they went, rather than simply repeat predetermined functions.

They were called neural networks: systems that pass information forward to make a prediction and pass errors backward to correct it, observing the effect of each adjustment and modifying their internal connections, learning from mistakes the way we now think the human brain does. Once a network had been trained with enough examples, it could receive new information, make sense of it and apply it, even though it had never been programmed to deal with that particular circumstance. Computers were learning to ‘think’.
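
A minimal sketch makes that forward-and-backward loop concrete: below, a tiny network learns the XOR function by passing inputs forward, measuring its error and pushing corrections back through its weights. The network size and learning rate are arbitrary choices for illustration, not anyone’s production system.

```python
import numpy as np

# A toy network learns XOR: inputs flow forward to produce a guess,
# then the error flows backward to adjust the weights (backpropagation).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input-to-hidden connections
W2 = rng.normal(size=(4, 1))   # hidden-to-output connections

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10_000):
    # Forward pass: make a prediction with the current weights.
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Backward pass: measure the error, push corrections back
    # through the network, and nudge every weight accordingly.
    d_output = (y - output) * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)
    W2 += 0.5 * hidden.T @ d_output
    W1 += 0.5 * X.T @ d_hidden

print(np.round(output, 2))     # converges towards [0, 1, 1, 0]
```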

This is not blue-sky R&D. Major tech companies like Google and Al’s old company, IBM, are already deep into artificial intelligence projects. Some of the early learning has already been applied, such as the so-called ‘bots’ that replaced the people in Al’s Contact Centre: programs that can deal with a huge number of basic customer inquiries based on previous customer interactions. Each exchange builds the database of responses, with complex questions referred up the line to a human who provides the correct information and adds it to the database, ensuring no human will ever have to answer that question again. A living engine that, with a voice recognition overlay, can provide a personal phone experience.
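
The escalation loop itself fits in a handful of lines, again with invented names and data. An unknown question is referred to a human, and the human’s answer is written straight back into the knowledge base so the same question never needs escalating twice:

```python
knowledge_base = {
    "what are your opening hours": "We're open 9am to 5pm, Monday to Friday.",
}

def escalate_to_human(question: str) -> str:
    # Stand-in for routing the call up the line to a human operator.
    return input(f"[operator] Customer asks {question!r}. Your answer: ")

def answer(question: str) -> str:
    key = question.lower().strip("?! ")
    if key not in knowledge_base:
        # Refer the complex question to a human, then remember the
        # response: no human ever answers this question twice.
        knowledge_base[key] = escalate_to_human(question)
    return knowledge_base[key]
```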

Predicting the impact this technology will have on jobs is still guesswork. In a 2015 report, the Committee for Economic Development in Australia predicted five million jobs, or 40 per cent of the workforce, would be replaced by automation in the next two decades. Other studies, typically commissioned by tech and business lobbies, assert that the impact will be less drastic. In truth they are all really putting their fingers in the air. What we do know is that any job where decisions follow a logical thought progression is ripe for replication.

‘If any decision that you make requires less than two or three seconds of thought,’ Al says, ‘then AI can probably already do it.’ It’s relatively easy to write a predictive algorithm that can reliably get through that step; if the next step in the chain of reasoning is similar, AI can likely complete that one too, and then every step in the chain, quickly arriving at what we would ordinarily think of as a complex decision. ‘If you think about any task that you do, if it can be broken down into a lot of sub-tasks, all of which require about a couple of seconds of reasoning before a decision can be made, then an AI can probably be trained to do it.’
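
As a hedged illustration of that idea, here is a hypothetical loan approval broken into sub-decisions, each one taking a human only a moment of thought; chained together, they produce what we would ordinarily call a complex judgement. Every field and threshold below is invented.

```python
from dataclasses import dataclass

@dataclass
class Application:
    income: float
    debts: float
    years_employed: float
    missed_payments: int

def serviceable(app):   # sub-decision 1: can they carry the debt?
    return app.debts / max(app.income, 1) < 0.4

def stable(app):        # sub-decision 2: is their income stable?
    return app.years_employed >= 2

def reliable(app):      # sub-decision 3: do they repay on time?
    return app.missed_payments == 0

def approve(app):
    # Each step is a two-second call; chained, they replicate what
    # we would ordinarily regard as a complex decision.
    return all(check(app) for check in (serviceable, stable, reliable))

print(approve(Application(income=90_000, debts=20_000,
                          years_employed=3, missed_payments=0)))
```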

Now, think about your job for a second. If you work in any sort of job with a routine set of actions, it is at risk. Driverless cars, drones and robotic pods will be able to move people and goods. Programmable devices will be able to cook and serve food and drinks, clear tables, clean things. Any work involving surveillance or monitoring is vulnerable: security, policing, the armed forces. Unlike previous waves of change, AI will also strike the top end of the labour market, the coveted professions that have been career destinations. Most accounting functions will be able to be replicated, as will most legal process work, from contracting to compliance. Finance will be hollowed out, with most approval and authentication processes able to be automated. Complex engineering calculations will be solved by machines. Already the stock market has been largely automated by programmatic trading. Even medicine won’t be immune: GPs will become more like data processors, national e-health initiatives will take this to an industrial scale, and predictive algorithms will likely give a clearer diagnosis than a human expert.

Creative work may not be immune either. Some sports and finance journalism is already being carried out by programs that can tailor content to the observed interests of a user. In my world of political research, pre-programmed phone calls with recorded messages can construct a ‘robo-poll’ at a fraction of the cost of polls conducted by human interviewers. Robots are also being programmed to create visual art and compose original pieces of music that look and feel like the real thing to the untrained eye and ear. But this is just scratching the surface of what is to come.

In the celebrated Go matches of 2016, Google DeepMind’s AlphaGo program not only defeated the world champion of the ancient and deeply complex territorial board game, it did it with a combination of moves that had never been tried before. That jump from programmed response to something approaching creative thought broadens the potential application of AI dramatically. If machines can not only make faster and more accurate decisions but also take information and analyse it creatively, what point is there in a human worker at all?

It may only be in the caring professions, jobs requiring human-to-human contact like nursing and child care – jobs that have traditionally been given a low value in the economic system – where there will be a need for human workers at all. Then again, with the right algorithm, who can tell?

‘The thing that most people don’t appreciate is how fast this change is coming,’ Al warns me. ‘We usually think of things happening gradually, one step at a time, but technology has been moving exponentially for decades.’ For the past 50 years computing power has doubled roughly every two years, the trajectory captured by Moore’s law: exponential rather than incremental, a pace too rapid to even comprehend, let alone absorb. Machine intelligence is not the start of the curve, it’s the acceleration of that curve.
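
The arithmetic is easy to check: a doubling every two years over 50 years compounds to 25 doublings, a factor of more than 33 million.

```python
# Doubling every two years for 50 years: 25 doublings in all.
doublings = 50 // 2
growth = 2 ** doublings
print(f"{growth:,}x")   # 33,554,432x the starting capability
```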

What is the end point? When machines and artificial intelligence become so advanced that they begin to design and manufacture themselves, becoming self-improving and self-replicating.

‘The most optimistic technologists predict that this event will occur within the next 30 years,’ Al explains. That’s the point at which machines will ‘bootstrap’ themselves at such an accelerated rate that humans may be unable to control or predict what could happen next.

#

The resource fuelling this revolution is the data that captures our behaviour, often without us even knowing it. Already the amount being collected is measured in quintillions of bytes, a number so large I’d never seen it applied to anything before. The capacity to collect this information is growing so fast that 90 per cent of the world’s data has been gathered in the last two years alone. We know Google and Facebook collect masses of data from users who offer their information in an unconscious deal in return for free access to their platforms. Supermarkets offer rewards in return for the right to track customer behaviour. Huge data companies like Experian acquire any companies that collect data, from credit ratings agencies to robo-poll market research. Each of our online interactions has become part of the massive global data-harvesting enterprise.

But online activity is the tip of the data iceberg. Big companies are increasingly collecting massive banks of data from employees during their workdays. Amazon is a world leader, rigging up its own workers with GPS trackers to monitor their movements and patenting an ultrasonic tracker that follows the movements of workers’ hands. Freelancer site Upwork is reported to monitor workers through webcams; a UK railway company has equipped workers with devices to manage their energy levels. Microsoft Outlook now monitors users’ email activity as a default setting on all Office 365 installs. Al gets a report every week on his own activity and, no doubt, so does his manager. While the initial fear may be that the company is monitoring its workers’ performance as a management tool, the greater risk is that it is helping design their future robotic replacements.

Then there is the so-called ‘Internet of Things’, the data that will be collected through the digital footprints we generate: the smart technology in our phones, our cars and our fridges, our health records, our retail habits, the smart speakers that will control the devices we use in our homes, the public buildings we visit. Every step of every day we will be adding to the bank of human data. While these intrusions are marketed as services, the business model of most is to harvest and monetise the massive trove of information that is extracted as we embrace the next smart device in blissful ignorance.

‘For the first time in human history we’re now at the point where we can observe and understand a human’s qualities and frailties, mental and emotional, and exploit them at scale,’ Al laments. ‘It’s like nothing we have ever seen before.’ And because we have never seen it, it’s hardly surprising that we have no idea how to respond. And so the technology just rolls on over us as if we weren’t even here.

#

I’m sitting in the Great Hall at the University of Sydney to hear David Plouffe, one of the architects of Barack Obama’s presidency, describe his excellent adventures in political strategy. It is a much-hyped, well-attended affair, but it soon becomes apparent Plouffe isn’t here to talk about his prowess as head of Obama’s winning campaign in 2008. Rather he is spruiking his new boss, the Uber ride-sharing service. Plouffe talks a massive game. Uber is the economic game-changer. It can drop into any city on earth and create 20,000 new jobs! Now! Jobs just like that! And the greatest part? You could work whenever you want for as long or as little as you want!

The crowd of earnest lefties sucks it all in as I get angrier – and not just because we have been dragged in under false pretences. I see something sinister in his pitch. These are not real jobs, they are simply tasks undertaken for a floating fee, with Uber clipping the ticket on every ride. Uber is not the transformative new idea Plouffe is throwing about. It is actually a pretty crappy old idea: undermining (or ‘disrupting’ as the cool kids say) an existing industry by bypassing the network of regulations governing safety, licensing and workplace rights, including a decent wage.

Like a conquering power, Uber doesn’t recognise laws or regulations. It enters a new market and simply asserts there are no rules in place to stop it. Because it doesn’t employ any of its drivers, simply books them for each ride via its app, it’s not a taxi company, it’s a cool new gig-economy ride-sharing service, so why would old-world taxi regulations apply? It will fight to avoid having to train drivers, to pay them super or ensure safety checks – after all, it’s not a taxi service. It will use its start-up war chest to hire the smartest lawyers and lobbyists, and at some point it will concede a little, but not too much, ground, and accept some, but not too many, regulations. Thus it will colonise another market.

As Uber has grown I have watched more people I know become believers in the app, convinced it delivers a superior user experience. Who am I to say otherwise? I have refused to be seduced as a matter of principle. I talk to cabbies whose life savings were invested in a set of plates whose value is now in free fall and drivers who haunt empty streets and openly contemplate packing it in. The Uberites rationalise that the taxi business is crooked and Uber is just breaking the racket, and maybe they have a point there too. But my frustration is knowing that my friends are making selfish short-term choices with no long-term thought for the implications of their actions.

Replacing the taxi industry is only the start of Uber’s designs. Without asking its passengers, Uber is also using GPS to track every ride its drivers take: every lane-change, every rat-run, every passenger sweet spot. It’s no coincidence that Uber is currently one of the leading developers of driverless cars. The data it is collecting will be critical in developing the algorithms to power these vehicles. At that point the Uber drivers will go the way of the taxi drivers whose industry they ruined. What comes next is just as predictable. Uber will get richer and richer while providing fewer and fewer jobs. The wealth created will be hived off into a tax haven. The data collected will be further monetised. The company will invest in cool new ways to suck up more money by replacing more workers with algorithms. And it will be unstoppable because no-one will see it coming.

#

As the Contact Centre sale was finalised, Al decided it was time to displace himself too. It wasn’t just the outsourcing, it was the contracts he was working on that were all about giving governments new tools to observe, track and log the movements of their citizens. He took some time out to get his thoughts in order and went back to university to reacquaint himself with philosophy as he tried to make sense of the looming reckoning. That’s where he came up with the notion of ‘digital colonialism’, the idea that we are being conquered by technology firms like Uber that assert they are merely taking resources that are lying there underutilised; and in the process taking our jobs, taking our data, and setting their own rules because there’s nothing to stop them.

‘This new digital world is like a virgin landscape open for exploration and colonisation by intrepid and ambitious opportunists,’ Al says. ‘There are few laws and regulations and no national boundaries or even physical limitations. Tech companies have appropriated this digital world by deeming it to be terra nullius, an “empty land”.’

An IMF working paper entitled ‘Should We Fear the Robot Revolution? (The Answer Is Yes)’ argues there is little reason for optimism that the jobs displaced will be replaced in other sectors. Using complex modelling, it cannot find a credible scenario in which enough new jobs emerge. It finds that while automation may be good for economic growth, it will be bad for equality. That’s the other exponential trendline running alongside the pace of technological change: the increasing concentration of wealth across western economies.

Like the European invaders in the so-called New World, Big Tech is thriving in this virgin wilderness, part of the fastest concentration of wealth in human history. Every year the percentage of global wealth held by the top one per cent of the world’s population increases. According to Oxfam, 82 per cent of the wealth generated worldwide in 2017 went to the top one per cent, and the world’s richest eight individuals hold the same amount of wealth as the poorest 3.6 billion people. Year on year the share of national product going to capital over wages is increasing, the result of falling incomes as well as preferential tax treatment. The list of the wealthiest individuals on earth is a roll-call of the tech barons: Microsoft’s Bill Gates, Amazon’s Jeff Bezos and Facebook’s Mark Zuckerberg are all in the top ten. Big Tech has opted out of the tax system, setting up operations in low or no-tax jurisdictions while their owners manage their affairs through webs of trusts in tax havens.

That’s the reality behind the cool brands that run the web. They suck money out of economies, they destroy jobs and they undermine the tax bases that fund the basic services a society needs to function. While AI will further consolidate their wealth, the erosion of jobs will push more and more people down to the bottom strata, to lower incomes and lower qualities of life. The tipping point will come when the vast majority of citizens no longer have the income required to spend on goods and services to keep the economy ticking over. Flatlining wages in Australia are already seen as a key economic risk. If, as the Committee for Economic Development in Australia predicts, millions lose their jobs and are pushed into precarious work or onto welfare, this will only get worse.

Al points to the words of one man who really saw the future, Stephen Hawking. In 2015 he warned that it is not robots that are the threat to humans, but their owners. ‘If machines produce everything we need, the outcome will depend on how things are distributed,’ he warned in an online discussion hosted by Reddit. ‘Everyone can enjoy a life of luxurious leisure if the machine-produced wealth is shared, or most people can end up miserably poor if the machine-owners successfully lobby against wealth redistribution. So far, the trend seems to be toward the second option, with technology driving ever-increasing inequality.’

As the wealth gap increases, we are sowing the seeds of the divisive, grievance-driven politics that is rising to challenge the current global order. In democracies where the displaced still have some power at the ballot box, voters are already backing politicians who reject the established system. The insurgencies of Trump and Sanders inside previously centrist parties are one example. Britain’s vote to leave the European Union is another. Right-wing populists are rising across the globe, vowing to reduce immigration and close national borders, to turn inwards and shield people from change they can’t comprehend, and all of it happens with the coercive support of the very technology they promise to protect their citizens from.

But they are all missing the point. The borders that really need protecting are the invisible digital ones, the borders that define what the web is and how it should be allowed to develop. Unlike physical borders, these are undefined and unprotected. We still haven’t thought through what rights we have – or want – over the collection of our data. There are no ethical standards over the development of AI, no regulations to create frameworks, just the can-do-anything attitude that has driven the development of the web through its first two decades. This is not about smoothing out some rough edges. The web has become a wrecking ball indiscriminately knocking our world apart with no regard for the consequences.

It is becoming clearer to me as I write that we urgently need some sort of assertion of human sovereignty if we are going to have any influence over where this uncontrolled, ferocious surge in technology is to take us next.