Chapter Five

IT’S NOT ALL ABOUT ‘US’

In 1994, I spent a month in Zimbabwe with my school cricket team. Our final game was at a festival in Harare, against a school called Plumtree. Their star player was one Henry Olonga, a fast bowler who the following year would become the first black player to represent Zimbabwe at Test level. He was not much of a batsman by professional standards, but he still managed to score a century against us, smashing our hapless attack for boundary after boundary. His bowling was something else entirely. I remember watching the Plumtree team take the field and wondering why their wicketkeeper was standing halfway between the stumps and the boundary. When he was joined by a slip cordon, I realised that was where they were expecting to be able to take the ball. All I saw of Olonga’s first delivery was a puff of dust from the pitch, before the ball hit the keeper’s gloves with a thud a fraction of a second later. He was seriously fast.

As I learned when I went out to bat, it wasn’t just his pace you had to worry about. The ball moved – off the pitch, but also in the air. Facing my first ball, I shaped to play a forward defensive just outside off stump, but it swung away from me and the keeper had to dive to take it on the leg side. A couple of wider deliveries kept moving after passing the bat – one so dramatically that it beat the keeper’s left hand and sped away for four byes, to Olonga’s great frustration. The final ball of the over was shorter. I got into a backward defensive position and hoped for the best. The ball flashed off the face of the bat and the next thing I knew my teammates were applauding as it raced over the third man boundary and onto the dirt road beyond. After this, Olonga decided he had had enough and took himself out of the attack.

That was the high point of my cricketing career, but Olonga was only just getting started. In the following eight years, he would play thirty Test matches and fifty one-day internationals for Zimbabwe, taking five-wicket hauls against India, Pakistan and England. He was selected for his country’s Cricket World Cup squad in 1996, 1999 and 2003.

It was in the last of these competitions, played in Kenya, South Africa and on home turf, that Olonga’s life as an international cricketer would come to an abrupt end. In a match against Namibia, he and his teammate Andy Flower, who would later go on to coach England, took the field wearing black armbands. They explained themselves in the following statement:

It is a great honour for us to take the field today to play for Zimbabwe in the World Cup. We feel privileged and proud to have been able to represent our country. We are, however, deeply distressed about what is taking place in Zimbabwe in the midst of the World Cup and we do not feel that we can take the field without indicating our feelings in a dignified manner and in keeping with the spirit of cricket.

We cannot in good conscience take to the field and ignore the fact that millions of our compatriots are starving, unemployed and oppressed. We are aware that hundreds of thousands of Zimbabweans may even die in the coming months through a combination of starvation, poverty and AIDS.

We are aware that many people have been unjustly imprisoned and tortured simply for expressing their opinions about what is happening in the country. We have heard a torrent of racist hate speech directed at minority groups. We are aware that thousands of Zimbabweans are routinely denied their right to freedom of expression. We are aware that people have been murdered, raped, beaten and had their homes destroyed because of their beliefs and that many of those responsible have not been prosecuted. We are also aware that many patriotic Zimbabweans oppose us even playing in the World Cup because of what is happening.

It is impossible to ignore what is happening in Zimbabwe. Although we are just professional cricketers, we do have a conscience and feelings. We believe that if we remain silent that will be taken as a sign that either we do not care or we condone what is happening in Zimbabwe. We believe it’s important to stand up for what is right.

We have struggled to think of an action that would be appropriate and that would not demean the game we love so much. We have decided that we should act alone without other members of the team being involved because our decision is deeply personal and we did not want to use our senior status to unfairly influence more junior members of the squad. We would like to stress that we greatly respect the ICC and are grateful for all the hard work it has done in bringing the World Cup to Zimbabwe.

In all the circumstances we have decided that we will each wear a black armband for the duration of the World Cup. In doing so we are mourning the death of democracy in our beloved Zimbabwe. In doing so we are making a silent plea to those responsible to stop the abuse of human rights in Zimbabwe. In so doing we pray that our small action may help restore sanity and dignity to our Nation.

To criticise the authoritarian regime of Zimbabwe’s president Robert Mugabe so directly and publicly was an act of great bravery that brought serious consequences. At thirty-four, Flower was nearing the end of his international career and already had a contract with Essex County Cricket Club in the UK, but Olonga was only twenty-six and had no plans to leave Zimbabwe. Immediately ostracised by the country’s cricket establishment, he was barred from even boarding the team bus. He began to receive death threats. The minister of information, Jonathan Moyo, publicly called him an ‘Uncle Tom’ with ‘a black skin and a white mask’. The secret police let it be known that they were sending plain-clothes officers to Zimbabwe’s final game of the World Cup, with the intention of arresting Olonga. They hinted that he would be charged with treason, which carried the death penalty, or ‘taken care of’ in some other way. With only a sports holdall and the cricket gear he was wearing, Olonga went into hiding in South Africa, before fleeing to the UK on a plane ticket gifted to him by a sympathetic stranger. He had escaped with his life, but would never play international cricket again.

What does this have to do with data and digital technology? I want to highlight how different the stakes can be depending on where someone happens to grow up. In many respects, Henry Olonga and I had similar childhoods. We both had professional parents who divorced when we were children. We both went to Christian boarding schools with great facilities and staff who valued and rewarded achievement in sport, theatre, art and music. We were sufficiently alike for our paths to cross on that cricket field in Harare in 1994. However, what was completely different about us was our relationship to the state.

At the time of Olonga and Flower’s protest, I was living in a small town in Buckinghamshire and working in the digital marketing department of a bank. I’d recently been put at risk of redundancy, along with several hundred colleagues. I wasn’t sure whether I wanted to relocate for a job elsewhere in the company, or accept a severance payment and move on. On the one hand, I disliked uncertainty; on the other, the economy was booming, my skills were in demand and the redundancy package would give me some savings. I was considering joining the staff union, and was weighing up whether the advice I’d get was worth the cost of membership. It wasn’t the best of times, but I was cautiously optimistic about the future.

Some possibilities didn’t ever enter my mind. It didn’t occur to me that hyperinflation might render my redundancy package worthless overnight or that the job market might suddenly collapse, leaving me with nowhere to work. I never imagined a scenario in which I’d be blacklisted for joining the union or forcibly evicted from my home by armed police. Similarly, when I’d spoiled my ballot in the previous general election to make a point about the shortcomings of the first-past-the-post system, not once did I consider that I might be threatened or harassed at the polling station.

That was because where I lived there were a lot of basic freedoms, guaranteed by the state, which I took for granted: personal autonomy, freedom of the press, freedom of association, the rule of law, fair elections, and so on. I suspect most people reading this book are in a similarly fortunate position, but coming of age in Zimbabwe, Henry Olonga could not count on these freedoms at all. At the time, the power of the state was being wielded by Mugabe’s party, ZANU-PF, in a very different way – not to expand freedom, but to exert control. On the pretext of land reform, farmers were driven from their property by government-backed militias. When the courts ruled that this expropriation was illegal, the government curtailed their powers and replaced the judges. Political opposition by the Movement for Democratic Change, led by the trade unionist Morgan Tsvangirai, was suppressed and its supporters were persecuted and detained. Despite hyperinflation, mass unemployment, frequent power cuts, food shortages and the failure of the drinking water supply, not to mention international opprobrium, Mugabe’s position was never threatened. ZANU-PF controlled the army, the security services and the media, which, combined with its willingness to use violence, made it incredibly effective at crushing dissent.

However, digital technology has begun to change things. The proliferation of mobile phones and the advent of social media have equipped Zimbabweans with tools to organise protest and mobilise support, as well as a space to express political grievances and demands that is beyond the government’s reach. In 2003, Henry Olonga and Andy Flower needed the spectacle of the Cricket World Cup, as well as connections to foreign journalists, to spread their message. But by 2016, Evan Mawarire was able to instigate a mass movement, #ThisFlag, with a four-minute video filmed on his phone and shared on Facebook. In it, the pastor and activist spoke directly to camera, reflecting on the values represented by the flag’s colours, and the ways in which they had been betrayed by the actions of Zimbabwe’s rulers. It went viral on social media, quickly building an online following for Mawarire that he used to campaign against government abuses. He and his supporters petitioned for corrupt officials to be dismissed, organising a general strike that brought the country to a standstill. A counter hashtag created by the government failed to gain traction, forcing it to resort to shutting down the internet. Mawarire has been repeatedly arrested and his family threatened, but the movement he initiated is not dependent on him: inspired by his example, student activists have used Twitter, Facebook and WhatsApp to co-ordinate marches on parliament under the banner #ThisGown, protesting against the government’s failure to deliver on promises of job creation.

The same technologies also enable more unruly forms of resistance. Baba Jukwa, an anonymous blogger who claims to have been a ZANU-PF insider, has turned their Facebook page into a space where Zimbabweans can openly criticise the government, contest propaganda, mock the ruling elite and defy repressive laws on public speech. It has exposed ZANU-PF plots to assassinate political opponents and rig votes, and has mobilised ‘twitchfork’ mobs against politicians. By the time of the 2013 general election, Baba Jukwa had more Facebook followers than any Zimbabwean politician and almost three times as many as Mugabe. The government has arrested and charged individuals it suspects of being connected to Baba Jukwa, to no avail – digital technology means the strategies of coercion ZANU-PF relied on in the previous decade are no longer viable.

Zimbabwe is not the only place where this story is unfolding. During the Arab Spring, Facebook groups, Twitter hashtags and YouTube citizen journalism were instrumental in overthrowing authoritarian rulers in Tunisia, Egypt, Libya and Yemen. It’s also not the only story about shifts in the balance of power which digital technology has enabled. In China and Ethiopia, citizens’ internet activity is monitored to pre-empt organised dissent and consolidate governments’ power. At the same time, well-meaning online movements can backfire, as happened with #BringBackOurGirls in Nigeria; the campaign tried to force Boko Haram to release schoolgirls it had kidnapped, but ended up raising the terrorist group’s profile and strengthening its negotiating position with the country’s government.

Nevertheless, the politics of Zimbabwe is a helpful reminder that digital technology is not all about ‘us’. If we keep that country in mind, we might moderate the language we use when we discuss how the user interface design of devices and apps ‘makes us less free’. As Henry Olonga wrote about the impact of the armband protest, ‘maybe for others yet we challenged their own world view in a way that they could reflect on their own lives and revel in the freedoms they enjoy.’ What has happened in Zimbabwe should also make us think more rigorously about the technology policy solutions that are currently being proposed, and the critique surveillance capitalism theory makes of digital advertising-based business models.

For example, there are calls for tech companies to act more assertively against online harms like trolling and cyberbullying, and to insist that all users verify their identities. However, advocates of mandatory identity verification seem not to have considered the implications for dissent in places like Zimbabwe, where the use of pseudonyms can be a matter of life and death. Without the ability to conceal one’s identity, speaking out against the Zimbabwean government on social media would carry risks as serious as those faced by Henry Olonga and Andy Flower.

There are also calls for big tech companies to be broken up on anti-monopoly grounds. This is an issue that we’ll discuss in more detail in Chapter Eight, but for now let’s observe that its proponents don’t seem to have considered the benefit for political organising of social media companies holding dominant market positions. Facebook Groups is not the only tool you can use to arrange events and communicate with a closed community, nor is WhatsApp the only free, easy-to-use, end-to-end encrypted messaging service accessible to anyone with a mobile phone and an internet connection. However, they’re the best solution for organising in countries like Zimbabwe precisely because they have by far the largest number of users. As Zeynep Tufekci points out in her book Twitter and Tear Gas: The Power and Fragility of Networked Protest, ‘Facebook is crucial to many social movements around the world, and there is no real alternative because of its reach and scope’.

Calls for the digital advertising-based business models of Google, Facebook and Twitter to be banned persist, but the commentators and politicians arguing for such a ban forget that most of these platforms’ users don’t live in affluent Western liberal democracies. For example, only two of Facebook’s top ten country markets by user numbers are in the West; its largest market by this measure is not the United States (210 million users), but India (300 million users). The Facebook population of Brazil (130 million) is more than three times that of the UK (40 million). And the Philippines (75 million) has significantly more Facebook users than France (33 million) and Germany (31 million) combined. If there were such a thing as ‘the average Facebook user’, their daily life would look more Zimbabwean than British.

Looked at in a global context, the effects of Facebook’s advertising-based business model are highly progressive – they transfer value away from the richest users and towards the poorest. All Facebook users receive identical services, but there are huge differences in the value of their attention and clicks. The result is that in 2018, $25.5 billion of value was effectively transferred from Facebook users in Europe, the US and Canada to users elsewhere in the world, as the following analysis of the company’s financial results demonstrates.

[Figure: Transfer of value between Facebook users in different world regions in 2018.]

In the language of political philosophy, this is a form of distributive justice. It satisfies two principles set out by the influential liberal philosopher John Rawls – that inequalities should only be permitted when they most benefit the least advantaged, and that people living under unfavourable political conditions should be assisted by people who live under more favourable ones.

Imagine what would happen if targeted digital advertising were outlawed and Facebook charged users a flat fee for access to its services. The amount that would be required from each user to match Facebook’s 2018 advertising revenues is about $25. If the annual subscription price were set at that level, hundreds of millions of people in the Global South would be priced out of the platform – it would effectively be thirty-four times more expensive for the average Indian than for the average American. In practice, Facebook might vary its subscription pricing by country as Netflix does, with annual charges ranging from $38 in Turkey to $148 in Denmark, but the breakdown of Netflix users by region suggests that Facebook usage outside the West would still plummet. In short, if Facebook had to move from advertising to a subscription model, it would become like Netflix, Amazon Prime and Apple – a platform for the affluent. And that would be bad news in countries like Zimbabwe.
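For readers who want to check the arithmetic of the flat-fee scenario, it can be sketched in a few lines. The revenue and income figures below are approximate public numbers for 2018, not drawn from this book’s own analysis, so the results are illustrative only:

```python
# Back-of-envelope sketch of the flat-fee scenario, using approximate
# public figures for 2018 (illustrative assumptions, not the book's data).

ad_revenue = 55e9   # Facebook's 2018 advertising revenue, roughly $55 billion
users = 2.3e9       # monthly active users at the end of 2018, roughly 2.3 billion

# Annual fee each user would need to pay to fully replace advertising revenue
flat_fee = ad_revenue / users
print(f"Flat annual fee: ${flat_fee:.0f}")  # ~$24, close to the $25 cited above

# The same dollar fee weighs far more heavily on an average Indian income
# than on an average American one (approximate income per head, 2018)
income_us, income_india = 62_000, 2_000
relative_burden = (flat_fee / income_india) / (flat_fee / income_us)
print(f"Relative burden, India vs US: {relative_burden:.0f}x")
```

With these rough inputs the burden ratio comes out at about thirty – in the same territory as the thirty-four-times figure in the text, with the exact number depending on which income measure is used.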

[Figure: Comparison of Facebook, Netflix, Amazon Prime and Apple by region.]

Another suggestion is that Facebook should adopt a so-called ‘freemium’ model. In this scenario, Facebook would continue to provide its current services for free, but would also make premium versions of the Facebook and Instagram apps available. Users would be able to opt out of ads and the associated data collection and targeting – subject to their financial resources, of course. Provided that the subscription revenue from the premium service subsidised the free services, the current benefits of redistribution would be preserved. However, such an arrangement would clearly be less egalitarian, reinforcing existing economic inequalities just as private alternatives to universal public services in healthcare or education do.

Apple’s Supply Chain

There is another serious problem with these policy proposals. In addition to having unintended consequences for non-Western users, they neglect issues of global justice that are more important than privacy. To paraphrase Cambridge politics professor David Runciman, it’s as if the moralising about data-driven ads has created a thick fog around policymakers. Not only does it make it more difficult for them to see what is really going on, it also provides cover for those with an agenda they want to conceal.

Take Tim Cook, CEO of Apple. He is a vociferous champion of his customers’ privacy rights, even contesting demands from the FBI and the courts to decrypt mobile phone data for use in criminal investigations. At the same time, in presentations to investors and interviews with journalists, he is an outspoken critic of Facebook and Google’s business model and leadership decisions. It is, of course, his job to talk Apple up and its competitors down, but what do Cook’s claims of his company’s superiority on privacy distract attention from? What gets obscured by the fog?

The answer is Apple’s supply chain, which Cook built up in his previous role as chief operating officer. Although Apple is an American company with its headquarters in Cupertino, California, the assembly of its devices is outsourced – primarily to the Taiwanese company Foxconn. Before being shipped to Europe and North America, the majority of iPhones are put together in Foxconn’s gigantic factory in Longhua, Shenzhen, which employs more than 200,000 workers. By Western standards, conditions are harsh. Shifts typically last twelve hours, and tasks on the assembly lines require both precision and speed: workers fastening motherboards have a quota of 600–700 iPhones per day; for those polishing screens, it may be as high as 1,700. Performance management is based on on-the-spot fines and public shaming for substandard productivity. The demands of the work and the cruelty of the culture take their toll: in 2010 alone, there were fourteen suicides on the premises. Nets were attached to Longhua’s high-rise dormitory buildings to catch falling bodies; when the journalist Brian Merchant bluffed his way through the security perimeter in 2017, they were still there.

Electronic components come to Longhua from other factories, like Pegatron’s 50,000-worker facility on the outskirts of Shanghai. In 2013, an NGO listed eighty-six labour rights violations there, in areas ranging from discrimination in recruitment to abuse by management, and from inadequate pay to poor safety standards. The following year, an undercover BBC reporter for the investigative documentary series Panorama was forced to work eighteen days in a row making MacBook parts, despite repeatedly asking for a day off. Another reporter, whose shifts lasted up to sixteen hours, was billeted with eleven other workers in an eight-person dormitory. Despite commitments by Apple to improve conditions, in 2016 the majority of Pegatron workers were still working more than a hundred hours of overtime per month.

Minerals and metals are integral to the manufacture of electronic components, and these come to Pegatron’s factory in Shanghai from mines across the Global South. The BBC found an illegal tin mine in Bangka, Indonesia, where twelve-year-old children were digging ore from twenty-metre-high sand and mud walls, at risk of being buried alive by a landslide. A worker at the mine told journalists that the tin was sold to smelters on Apple’s approved supplier list. Around 70 per cent of Indonesian tin comes from small mines similar to the one in Bangka, and it is practically impossible for Apple to be sure that its components don’t use the products of child labour.

Other iPhone and MacBook parts also require minerals and metals. Rechargeable lithium-ion batteries need cobalt, which has to be extracted before being shipped to refineries in China and Scandinavia. Approximately 60 per cent of it originates in the Democratic Republic of Congo’s Katanga province, where 35,000 children, some as young as six, work in appalling conditions in so-called ‘artisanal’ mines. For wages of less than $2 a day, they risk poisoning from toxic dust, and serious injury or death from tunnel collapses. Capacitors, meanwhile, require tantalum sheet, which is made from coltan ore. Around 80 per cent of the world’s coltan reserves are in the eastern part of the DRC. Despite efforts by Western governments to outlaw ‘conflict minerals’, much coltan extraction and transportation is still controlled by independent militias and factions of the Congolese army that are notorious for perpetrating forced labour, systematic rape and mass killings.

None of this is to suggest that Apple does not care about the wellbeing of workers at its suppliers, or about child labour and violent conflict. In 2018, it audited more than a thousand suppliers in forty-five countries for compliance with its supplier code of conduct, and it can point to many examples of educational initiatives that it has sponsored to improve the wellbeing and prospects of workers at its suppliers. Nor is it to suggest that Google and Facebook are not implicated in the same injustices: after all, they make devices like the Nexus phone and the Portal video calling display. The point is that it suits Tim Cook for public discourse to focus on issues of data and privacy, while hardware supply chain issues fade into the fog. If we are honest with ourselves, it suits us too. Righteous indignation about how our data is being used is easy; recognising the ways in which our own everyday use of digital devices makes us complicit in the exploitation of so many people is not.

A key element of Shoshana Zuboff’s argument in The Age of Surveillance Capitalism is the idea that digital technology companies have invented a ‘rogue capitalism’. However, attending to the material reality of how digital devices are made reminds us that extraction, coercion, violence and inequality have been persistent features of conventional capitalism for hundreds of years. It highlights the continuity from slavery and indentured labour on colonial plantations, through the hazardous chemical factories and garment sweatshops that blighted twentieth-century globalisation, to the cobalt mines of Katanga and the assembly lines of Longhua in our own time. We mitigate the ills capitalism produces with laws, regulations and standards, but most of all with scrutiny from civil society. And for this scrutiny to be effective, we must be able to see clearly.

Moderation in All Things

What other issues is the fog obscuring? Despite concerns about the harmful effects of social media, most Western users’ experience of platforms like YouTube, Twitter and Facebook is generally free from graphic violence, propaganda, pornography, scams, spam and hate speech. Contrary to what big tech companies like to imply, this is not all down to their machine learning algorithms; it’s because 100,000 people around the world work full-time to moderate online content. Like the miners, smelters and assembly line workers behind the iPhone, most of these people are not employed by the tech companies themselves. Instead, they work for specialist suppliers like TaskUs, or IT outsourcers like Accenture.

With a high proportion of English speakers and cheap labour costs, Manila in the Philippines has become the world’s capital of content moderation. Its new recruits are often recent graduates, paid between $1 and $3 per hour to review the worst words, video and imagery on the web, before deciding what is in violation of the platforms’ community standards and should be removed. As at Longhua, expectations of productivity are high and tolerance of mistakes is low: Twitter moderators review up to 1,000 items a day, and making three errors in a month moderating Facebook content is enough to get you fired. The work can also be psychologically traumatising: some moderators who review footage of terror attacks report paralysing fears of public places; others who review images of sexual abuse report damage to relationships; others find themselves tempted to recreate acts of self-harm they have witnessed. Little counselling support is available.

While young Filipinos bear the emotional costs of keeping our social media feeds clean, they also suffer a consequence of tech companies’ assumption that American free speech norms are universal: the rule of president Rodrigo Duterte. Duterte’s signature policy has been wiping out the trade in crystal meth. Many reasonable people might be sympathetic to that objective, but it is Duterte’s methods which have seen him condemned by the European Union, the United Nations and even the Roman Catholic Church. His repeated incitements of violence against suppliers and users of meth have led to the unlawful killing of as many as 29,000 Filipinos by the police, contract hitmen and vigilante gangs. Social media was key to his election victory in 2016 and remains at the heart of his communication strategy; his team uses a network of volunteers, celebrity influencers and inauthentic accounts to distribute propaganda, smear political opponents, denounce journalists and harass critics of his government on Facebook, Twitter, Instagram and YouTube.

Social media has also been used to persecute a minority in Myanmar. Since 2016, Facebook and Facebook Messenger have been used to spread hate speech about Rohingya Muslims, building popular support for a brutal campaign of ethnic cleansing. Villages have been burned to the ground by the Burmese army and tens of thousands of Rohingya have been raped and murdered, while hundreds of thousands more have been forced to flee into neighbouring Bangladesh.

These urgent issues have nothing to do with data-driven targeting or advertising-based business models. And yet, groping around in the fog, policymakers seem able to get hold of only data and ads. It isn’t horrific ad copy that scars content moderators in Manila, and Duterte’s government and the Burmese military aren’t using sophisticated ad targeting techniques to win support. Instead they rely on the core features of digital platforms – groups, pages, channels, hashtags and encrypted messaging. These, of course, are the exact same freely-available digital tools that are so important for resisting authoritarianism in places like Zimbabwe, meaning that forcing tech companies to pull out of politically volatile countries can’t be the answer, either.

Although they may not realise it, tech companies also provide a way in which abuses like those occurring in Myanmar and the Philippines can be investigated. This is the work of the open-source intelligence community, a network of journalists, human rights workers and volunteers who verify the authenticity of videos and images shared on social media that appear to document human rights violations. Using openly available tools like Photo Sphere on Google Maps and the Facebook search function, they first aim to establish whether the footage shows what it purports to; then they work out where it was recorded and the identities of the people in it, building the evidence needed to bring perpetrators to justice.

In July 2018, a video of women and young children being executed by firing squad somewhere in rural West Africa began circulating on social media. In the online comments, some people insisted that the footage was staged, while others claimed that soldiers from Cameroon’s army were responsible – a suggestion angrily dismissed by its government as fake news. There was also speculation that the killers were Boko Haram militiamen in Mali, dressed in army fatigues to mislead observers. Using open-source intelligence techniques, analysts from the Digital Verification Corps at Amnesty International were able to uncover the truth. Visible in the background of the video were terraced crops, low-growing vegetation and a distant mountain range: by triangulating the footage with satellite pictures from Google Earth, they were able to place it in the Far North Region of Cameroon, close to an army outpost. They identified the weapons carried by the killers as an unusual type of Serbian rifle, known to be used by Cameroon’s army. Searching Facebook, they found images of Cameroonian soldiers wearing uniforms that matched those in the video, and even the profile page of one of the gunmen. Presented with this evidence, the country’s government reversed its position and arrested seven of its soldiers.

Open-source intelligence is particularly helpful where it is difficult for human rights organisations to gather evidence on the ground – war-torn Syria being a good example. But like the search data analytics we discussed in the last chapter, open-source intelligence is in danger of becoming collateral damage in the backlash against big tech. Social media companies are under pressure to be more proactive in removing harmful content from their platforms and in mitigating risks to their users’ privacy. Pressure to remove content glorifying terrorism led YouTube to take down thousands of mobile phone videos of the conflict in Syria, which NGOs had archived to help with human rights investigations. Pressure to do more to protect users’ profile data led Facebook to deprecate Graph Search – the feature enabling all public content on Facebook to be searched, which the Digital Verification Corps had used to corroborate reports of a hospital bombing in Idlib in Syria and compile evidence that senior Burmese officials had directly ordered atrocities against the Rohingya people. Such decisions put ‘our’ user experience and concerns about being targeted with ads ahead of justice for the murdered and oppressed.

This brings us back to the Philippines. That evidence of abuse is being assiduously deleted from social media in a country where open-source intelligence might be the best hope of holding perpetrators to account is darkly ironic. And while advances in artificial intelligence might bring some respite to Manila’s content moderators, they would also bring unintended consequences. To quote Sam Dubberley from Amnesty International, ‘In the worst-case scenarios, algorithms will be able to remove these videos almost as quickly as human rights defenders can post them, with possibly devastating impact for investigators. We can’t ask for a video to be reinstated or used to build a case against a warlord if we never knew it was there in the first place.’

* * *

By now, you’ll probably have realised that I think many current policy proposals on data and tech are short-sighted. We’ll consider alternative policies in Chapter Eight, but for now I hope it’s clear why we should broaden the debate beyond the concerns that seem most pressing in the West.

In case you’re wondering what happened to Henry Olonga, he built a career as a commentator and public speaker, and continued to play cricket for the celebrity Lashings XI, until he was forced to retire through injury. He married Tara Read, an Australian PE teacher he had met while training at a cricketing academy in Adelaide – where they now live with their young family. In 2019, Henry, who is also a fine operatic tenor, made it through to the battle rounds of The Voice Australia. It would have been a nice coincidence for this book if will.i.am had been one of the judges, but he’d been replaced in that series by Boy George.