Chapter Two

MIND GAMES

My first visit to Facebook’s international headquarters in Dublin got off to a terrible start. It was February 2016 and we had been building Bought By Many for about three-and-a-half years. We had grown to sixteen employees in total, and there were seven in my marketing team. I’d persuaded Helen, who I knew from Experian, to leave her job at JustGiving and lead customer acquisition at Bought By Many. She in turn had hired Gordon, an expert in Google advertising, and Lyes, a bright graduate to train in Facebook ads. For about eighteen months, Helen and Lyes had tried in vain to persuade Facebook to assign us an account manager, so we could talk to a specific person about issues with our campaigns and ideas to improve their performance. Then, out of the blue, I received a phone call from what looked like an Irish number. ‘Sam,’ it began in an East Coast drawl, ‘my name is Sean Maloney, and I’m your dedicated Facebook account manager.’ Sean was calling to invite us to a workshop in Dublin with a group of other start-ups and various members of Facebook’s team. When he followed up by email, he told me we’d been transferred to a different account manager called Joe. I wondered why, but was too excited by the prospect of finally being able to engage with people at Facebook to dwell on it for long.

I also had the immediate task of convincing Steven that the whole acquisition team should go to Dublin. By this time, over one hundred thousand people had become members of Bought By Many groups and the revenue from introducing them to insurance companies was growing quickly. However, we were still a long way from turning a profit, and the time when raising another round of investment would become urgent was approaching. In other words, money was tight. Steven thought one of us should just go for the day and report back on what they’d learned, but I was thinking more broadly: here was an opportunity for us to start building a real relationship with Facebook, get answers to the questions we had about the mercurial ways of their algorithm and bond as a team in the process. I worked out that if we caught an early morning Ryanair flight and stayed one night in a cheap hotel, all four of us could do the trip for less than £500. I pointed out to Steven that it would only need a tiny improvement in the click-through rate of our Facebook campaigns to return this small investment, and he gave in.

Things started going wrong at Gatwick. Bleary-eyed, we met for breakfast in the terminal at 6 a.m. to find that our flight had been delayed by three hours, meaning we had no chance of being at Facebook’s offices in time for the 11 a.m. start. To make matters worse, we couldn’t reach Joe; for reasons I didn’t yet understand, we hadn’t been given his mobile number, and he hadn’t replied to the email Lyes had sent him. I had visions of us being turned away at the door of Facebook’s office and booted out of their account management programme. When we finally made it to Dublin, we sprinted through arrivals and piled into a taxi, only for the driver to warn us that it would be a slow journey: a funeral was taking place for a prominent gang member, and there were traffic restrictions and heightened security on the roads.

Finally the taxi dropped us off among the generic office buildings of Dublin’s docklands. Having heard about the grandiosity of Apple Park in Cupertino, California, seen the playfulness of the Googleplex in the film The Internship and read about the ‘village’ feel of Facebook’s campus in the San Francisco Bay Area, I was a little underwhelmed. Still, we had got there, even if we were an hour and a half late. We walked up the concrete steps and through a revolving door into an atrium. While the receptionist tried to get hold of Joe, I looked around at banquette seating, security pass-operated barriers guarding access to the lifts and a contraption dispensing covers for wet umbrellas. One or two people wearing lanyards were standing around chatting. There were a few slogans on posters at the mezzanine level – ‘DONE IS BETTER THAN PERFECT’, ‘PROCEED AND BE BOLD’, ‘WHAT WOULD YOU DO IF YOU WEREN’T AFRAID?’ and the now-infamous ‘MOVE FAST AND BREAK THINGS’ – but little else to let you know you were at the international HQ of one of the world’s most innovative companies. Posters aside, we could have been visiting one of Bought By Many’s insurance clients.

Eventually, a rosy-cheeked young man in a hoody wandered over and asked, ‘Are you Brought By Many?’ This was Joe. I introduced him to the team, explaining our roles and what each of us was hoping to get from the workshop, while emphasising the word ‘Bought’ as many times as possible. I apologised for our lateness and said that I hoped we’d be able to catch up on what we’d missed. ‘Oh,’ he said, ‘actually, we didn’t start the meeting yet. Some other people got delayed too, so we thought we could have a tour first and do the presentations after lunch. So tell me, what does Brought By Many do?’

As we joined the rest of the tour group, I asked Joe how long he had worked at Facebook and what he’d done beforehand. He explained that he’d moved to Ireland after graduating to work for a small digital advertising agency. A year later he’d got a job at a larger agency in Dublin, and then Facebook had approached him. This was his first account management role and he was only a few months into it. ‘So how many clients do you look after?’ I asked him. ‘Hmmm,’ he replied, ‘perhaps seventy-five or eighty? To be honest with you, I lose count!’

At Experian, like most B2B companies, we’d employed a lot of account managers. As an Experian client, if you spent in the tens of thousands of pounds on our products each year, you’d have an account manager. They would probably be relatively early in their careers, but they’d have a good understanding of Experian’s product range, and they’d ask enough questions about your business to show that Experian was interested in your success. If you’d been invoiced incorrectly or couldn’t log in to an Experian system, they would immediately sort it out for you. If you spent in the hundreds of thousands of pounds each year, a more senior account manager would look after you. They would take you out for lunch, invite you to drinks receptions and make sure you had priority access to new product features. They might even proactively conduct analysis to identify business opportunities for you. If you spent in the millions of pounds each year, you were assigned a ‘strategic account director’, who would bring the regional CEO or other luminaries to meetings with you, while knocking heads together to make sure you got whatever you needed. You’d be under no illusions about how much your business was valued.

Facebook, however, does none of this. At the time of our Dublin visit, Bought By Many was spending close to half a million pounds a year on Facebook ads, but that got us barely 1 per cent of Joe’s time. A few weeks later, we were late paying an invoice when our finance manager was off sick and our access to Facebook’s ad platform was instantly suspended. We couldn’t contact Joe by phone, as Facebook account managers aren’t allowed to give out their mobile numbers. When Joe eventually replied to an email from Lyes, he told him the process for accounts in arrears couldn’t be overruled, no matter how much we had spent in the past or how many times we had paid promptly before. Our sales ground to a halt, as we waited eleven days for Facebook’s accounts payable department to acknowledge our belated payment.

What explains this disregard for normal account management practice? I think it’s because, for Facebook’s sales and account management organisation, revenue accumulation is effortless. They have been able to surf a wave created by extraordinary user growth and the resulting shift of advertising budgets away from other channels. At the same time, they have benefited from the unprecedented user-friendliness and accessibility of the self-service advertiser tools that Facebook’s product managers and developers have created. Their job is so easy that not hitting their targets might be a greater achievement. Unfortunately, that doesn’t create the conditions for appropriately resourcing an account management function, or for a culture of providing good service to clients.

I wasn’t enjoying the tour. We were shown a cavernous cafeteria with napkin dispensers, tubs of ketchup sachets and a passive-aggressive sign indicating which direction to queue in. We went up one floor to an open-plan office area that was depressingly familiar, with banks of rectangular desks on grey carpet tiles, under strip lights and ducting. There were coats on the backs of wheeled office chairs and rucksacks on top of wheeled pedestals. There were bottles of hand sanitiser and discarded takeaway coffee cups. There were laptop docking stations with electrical safety testing stickers and trailing ethernet cables. A4 printer paper was stacked in corners next to neglected houseplants. Old leaflets and brochures spilled out of cardboard boxes. Whiteboards stood vacantly in front of windows or next to concrete pillars. The roller blinds were down, keeping the daylight off generic monitor screens. Quirky conference room names like ‘Ted and Dougal’ and ‘Quantum of Salads’ only served to underline how utterly ordinary the epicentre of Facebook’s international operations was.

My mood did not improve when the workshop finally started. A couple of Facebook account managers who were even younger and less experienced than Joe went through a presentation that explained the absolute basics of measuring online ad effectiveness. I finally let go of my hope that we were going to get the inside track on the newsfeed algorithm. A guy from Facebook’s creative team Skyped in from London and encouraged us to spend more on awareness-building video ads, until he got kicked out of the booth he was using and abruptly left the meeting. It was clear by this point that there would be no great revelations. Nobody at Facebook could teach us anything, and there was nothing worth knowing about Facebook ads that Helen and Lyes weren’t already doing. Like the characters in The Wizard of Oz, we’d gone behind the curtain and found that there was no magic. I wondered how I was going to explain this to Steven when we got back to London.

Then something unexpected happened. A young man in black jeans and a crisp white Oxford shirt took to the stage. Joe introduced him as Kristian Larsen, the co-founder of a specialist Facebook ads agency called PL & Partners, and said that he would be presenting a client case study. With a modesty and dry humour that I recognised as typically Danish, Kristian described the approach he’d taken to helping a T-shirt store develop its online business using Facebook ads. Budget was tight, so he’d started with a ‘retargeting’ campaign – he showed Facebook ads to people who had already visited the T-shirt website but left without buying anything. Using the Facebook ads feature called Custom Audiences, he’d shown different images to people, depending on which T-shirts they’d looked at. At the same time, through a process marketers call optimisation, he’d made sure the performance of the campaign kept improving by iteratively testing variations of the ads’ other ingredients – the headline, the message, the positioning of the company logo and whether the wording on the button should be ‘Læs mere’ or ‘Køb nu’ (‘Read more’ or ‘Shop now’). Once he was sure he was getting the most from the store’s existing audience, Kristian had turned his attention to attracting new customers. He’d created a list of the email addresses of the customers who had spent the most on T-shirts in the past and uploaded it to Facebook. With Facebook’s Lookalike Audiences feature, he was able to target a much larger group, whose profile and behavioural data suggested they were similar to the store’s most valuable existing customers.
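For readers who want to see the mechanics, the iterative testing Kristian described can be sketched in a few lines of Python. This is an illustrative toy, not Facebook’s actual delivery system: it mostly shows whichever ad variant has the best observed click-through rate, while occasionally exploring the alternatives (the button labels are borrowed from the case study; everything else is invented).

```python
import random

def pick_variant(stats, epsilon=0.1):
    """Epsilon-greedy choice: usually show the variant with the best
    observed click-through rate, occasionally explore another at random."""
    if random.random() < epsilon:
        return random.choice(list(stats))
    # CTR = clicks / impressions; max(..., 1) avoids dividing by zero for new variants
    return max(stats, key=lambda v: stats[v]["clicks"] / max(stats[v]["shown"], 1))

def record(stats, variant, clicked):
    """Log one impression of a variant and whether it was clicked."""
    stats[variant]["shown"] += 1
    stats[variant]["clicks"] += int(clicked)

# Two hypothetical button labels from the T-shirt campaign
stats = {"Læs mere": {"shown": 0, "clicks": 0},
         "Køb nu": {"shown": 0, "clicks": 0}}
```

With the exploration rate set to zero, the sketch simply exploits the best-performing variant; real ad platforms balance exploration and exploitation automatically, but the underlying loop – show, measure, drop the losers, reinforce the winners – is the same.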

The results were impressive: in just four months, the store’s website traffic had nearly doubled, its sales had grown 57 per cent and the return on investment in the Facebook ads was 10x. Even more impressive to me was the structured, data-driven and repeatable approach Kristian had described; I’d never heard Facebook articulate how to use its tools with such clarity. After the workshop had ended, I made a point of seeking him out and introducing myself to him and his co-founder, Mads. We agreed to meet up at their offices the next time I was in Denmark.

Buoyed by this conversation, Helen, Gordon, Lyes and I made our way over the River Liffey and checked into our hotel. We had some friendly banter with the receptionist as he noticed that we had three rooms between four and speculated about who would be sharing with whom. Having dropped off our bags, we headed out for a bite to eat and a pint of Guinness, unaware that at that very moment the receptionist was using the details from my company card to order £1,500 worth of garden furniture from the Argos website.

Psychological Manipulation?

In relaying Kristian’s presentation, I touched on the most important techniques and technologies that Facebook marketers have at their disposal: optimisation, Custom Audiences and Lookalike Audiences. Surveillance capitalism theory argues that these techniques amount to psychological manipulation, another strong claim. Manipulation is different from legitimate forms of influence, such as inducement, encouragement or persuasion, because it conflicts with the interests of the person on the receiving end of it. If Facebook ads are manipulative, it means they are so powerful that they make people do things they would not otherwise do. I think surveillance capitalism theory gets this badly wrong, and believe that media reporting of the Cambridge Analytica scandal is partly to blame.

The whistleblower Christopher Wylie, who worked as a part-time contractor at Cambridge Analytica for nine months in 2014, revealed that the company had promoted personality quizzes to a sample of Facebook users. People’s answers to the quiz questions enabled Cambridge Analytica to build a ‘psychographic’ model intended to predict any Facebook user’s personality based on their profile data. In this context, ‘personality’ is defined in terms of the so-called OCEAN segmentation, which classifies individuals according to openness, conscientiousness, extraversion, agreeableness and neuroticism. By subsequently obtaining profile data from tens of millions of Facebook users, Cambridge Analytica was able to apply its model’s predictions to a much larger population. It then targeted different political ads to these people, depending on their personality type: for example, American Facebook users with high neuroticism saw ads that stressed the risks posed to national security by immigration, while users with high openness saw ads that emphasised its benefits to the economy. Secret filming by Channel 4 News of Cambridge Analytica’s senior staff, including its CEO Alexander Nix and managing director Mark Turnbull, appeared to substantiate Wylie’s claims that Cambridge Analytica had used Facebook as a ‘psychological warfare tool’ on behalf of its political clients.

Most proponents of surveillance capitalism theory take these claims at face value, but there are two good reasons to be sceptical of them. The first is to do with the incentives created by the structure of the digital advertising market. Owners of digital advertising space like Facebook are on the supply side of the market, and have incentives to talk up the value of their product (advertising space) to their clients. Their objective is often to persuade advertisers that they should pay a premium for space that will be seen by a high proportion of their target audience. The advertising market, meanwhile, is full of intermediaries, including agencies like PL & Partners and so-called ‘adtech’ firms – businesses that use software and data to improve the returns of media owners or advertisers. Acting for the sellers of advertising space, media sales agencies and supply-side adtech companies are incentivised to exaggerate its value. Acting for the advertisers, media buying agencies and demand-side adtech companies are incentivised to give a rosy account of the benefits of purchasing premium advertising space in order to justify the commission or software licence fees their clients are paying them.

All claims about the power of Facebook data, the precision of its targeting and the effectiveness of campaigns on its platform need to be understood in this context. Offering political advertisers a mixture of Facebook media buying and technology, Cambridge Analytica, like other demand-side intermediaries, had strong incentives to talk up both the value of Facebook’s advertising space and its own ability to exploit it. Put simply, the sales messages articulated by Wylie, Nix and Turnbull are not reliable evidence of Cambridge Analytica’s capabilities, let alone of the effectiveness of those capabilities. The data scientist Aleksandr Kogan, who created the psychographic model, claims that it correctly predicted OCEAN personality traits in only 1 per cent of cases. However, Cambridge Analytica didn’t care that it was ‘useless’, because its client – the Ted Cruz presidential campaign in 2016 – was willing to continue paying for it. Even if the model had been good at predicting personality traits, there is no evidence that tailoring Facebook ads using psychographics is any more effective than conventional forms of targeting. As with subliminal TV advertising in the 1950s, the fact that marketing techniques seem creepy doesn’t mean they work. And from my own perspective as a marketer, psychographic targeting seems unnecessarily complicated compared to approaches like the one Kristian used to help his client sell more T-shirts.

The second reason to be sceptical about the Cambridge Analytica claims is that they muddle distinct techniques and technologies. You might have heard that Facebook and other tech companies continually modify the user interfaces of their apps to ‘nudge’ you towards particular actions. This is true: just as Kristian tested small variations of T-shirt ads to maximise the number of clicks, Facebook continually tests new ideas to see which designs encourage the greatest number of people to tap on a Messenger notification or include more hashtags on an Instagram post. You’ll recall that marketers refer to this technique as ‘optimisation’. It’s highly data-driven, but contrary to what surveillance capitalism theory suggests, the data involved isn’t individual profile data – it’s aggregated behavioural data. If people’s psychological profiles were being used to nudge them there would be every reason to take offence, but that isn’t how optimisation works; instead, it’s conducted on a mass scale and without differentiation between users. It doesn’t make sense to talk about optimisation manipulating people based on their individual psychological vulnerabilities, as it’s not remotely personal.

There are some well-known examples of optimisation playing a role in politics. Dominic Cummings wrote that during the UK EU referendum, Vote Leave ‘ran many different versions of ads, tested them, dropped the less effective and reinforced the most effective in a constant iterative process’. For some proponents of surveillance capitalism theory, being advocated by Cummings is reason enough for optimisation to be regarded as illegitimate. However, the approach he describes is methodologically identical to those used by the 2012 Obama presidential campaign, which found that an image of the president with his wife and children combined with a button marked ‘Learn More’ generated 40 per cent more donor registrations than an image of Obama by himself with a button marked ‘Sign Up’. When I interviewed Kristian for my academic research on Facebook three years after meeting him in Dublin, I asked him about Cummings’s account of Vote Leave’s Facebook ads strategy and he called it ‘Digital Marketing 101’. There is no wizard behind this curtain, either.

I think surveillance capitalism theory is also in a muddle over Custom Audiences – the Facebook feature that enables advertisers to target their existing customers with ads. To use it, the advertiser uploads a list of its customers’ emails, addresses, mobile phone numbers or Facebook IDs. Facebook then compares this list to its database and shows ads only to the people who appear on both lists. Proponents of surveillance capitalism theory seem to think this means that with the right data, people can be targeted with individually customised messages. But this simply isn’t the case: the minimum size for a Custom Audience is one hundred people, and you can’t show a specific ad to a group smaller than that.
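In outline, the list-matching behind Custom Audiences works something like the following Python sketch. The helper names are my own invention, and this is a simplification of Facebook’s real pipeline, but it captures two points from the text: identifiers are normalised and hashed (Facebook asks for SHA-256) rather than shared in the clear, and an audience below the hundred-person minimum cannot be used at all.

```python
import hashlib

MIN_AUDIENCE_SIZE = 100  # Facebook refuses to serve ads to a smaller Custom Audience

def normalise_and_hash(email):
    """Identifiers are lower-cased, trimmed and SHA-256 hashed before upload,
    so raw email addresses never leave the advertiser's systems."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

def build_custom_audience(advertiser_emails, platform_hashes):
    """Intersect the advertiser's hashed customer list with the platform's
    user base; only people on both lists can be shown the ads."""
    uploaded = {normalise_and_hash(e) for e in advertiser_emails}
    matched = uploaded & platform_hashes
    if len(matched) < MIN_AUDIENCE_SIZE:
        raise ValueError("audience below minimum size; ads cannot be served")
    return matched
```

The minimum-size check is precisely why the ‘individually customised message’ reading fails: however good your data, the smallest group you can address is a hundred people.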

Part of the confusion stems from language. Encouraged by software companies whose products make it easier to run optimisations and measure results robustly, marketers now commonly refer to their tests as ‘experiments’. This has the added benefit of reminding one’s colleagues that marketing is no longer about mood boards and boozy lunches: it’s a rational, quantitative discipline. Unfortunately, however, it has also led proponents of surveillance capitalism theory to conclude that Facebook users are continually being ‘experimented on’ without their knowledge or consent. This conflates an ethically dubious ‘experiment’ run by Facebook in 2012 to see whether the words people were exposed to on social media affected their emotional state with everyday common-or-garden optimisation ‘experiments’ of the type outlined in Kristian’s presentation. Another example is the renaissance of the term ‘dark post’. In the early days of Facebook ads, when the system was more rudimentary, the easiest way of advertising was paying to ‘boost’ something you’d already posted on your company’s Facebook page. ‘Dark post’ referred to a new post created for promotional purposes that you didn’t want to appear on your page in perpetuity. In other words, it was a synonym for ‘advert’. As Facebook Ads Manager became more sophisticated and the types of ads you could run multiplied, the term lost its meaning and declined in use. However, it’s now been revived by Facebook’s critics, seemingly because it implies that covert tactics and nefarious intent are inherent in Facebook ads.

It should not be assumed that Facebook ads are as persuasive as their sellers claim; nor are the optimisation techniques used by marketers to increase their effectiveness as sinister as surveillance capitalism theory suggests. The dystopian narrative that has emerged from the Cambridge Analytica scandal is simply not supported by the often mundane reality of digital advertising operations.

Lookalike Audiences at the Experian Museum

If there is magic in Facebook ads, Lookalike Audiences is it. Explaining what it does is fairly straightforward: it enables advertisers to target prospective customers with similar characteristics to their existing customers. As with Custom Audiences, the advertiser uploads a list of its existing customers to Facebook, but this time Facebook shows ads to a new audience that it deems similar on the basis of their profile and behavioural data. When an advertiser uses Lookalike Audiences, they don’t make decisions about targeting criteria themselves, as I did with my diabetes travel insurance campaign; instead, the targeting decisions are made by Facebook’s machine learning algorithm. Explaining how the Lookalike Audiences algorithm works and why it is so powerful is more complicated.

I may have given the impression that meeting Kristian and Mads was the only good thing about visiting Facebook in Dublin, but that’s not quite true. On the top floor of Facebook’s offices was a sort of company museum. It had installations about Facebook’s most technologically ambitious projects like the Oculus Rift virtual reality headset and the Aquila drone, a prototype for beaming Wi-Fi into remote parts of the developing world. It also showcased products from successful Facebook advertisers, including scooter helmets, collapsible water bottles with charcoal filters, luxury vegan chocolates, slippers with detachable soles, subscription boxes of scented candles, custom-painted cake stands and much more besides. The exhibits in the museum highlighted the fact that Facebook ads make possible a huge range of online businesses that wouldn’t have existed before. By enabling makers and inventors to market their products directly to the public, they level the playing field between small business owners and massive retail groups, increasing consumer choice and encouraging innovation. A venture capitalist I spoke to for my academic research went further: she believed that the explosion of UK start-ups centred around London’s ‘Silicon Roundabout’ couldn’t have happened without entrepreneurs having access to Facebook ads. Lookalike Audiences plays a key role: if you want to grow your cake stand business, you no longer need to run expensive focus groups or surveys to understand what sort of people are buying your cake stands, and then devise a marketing campaign around the TV programmes they watch or the newspapers they read. Instead, you can upload your existing customers’ email addresses to Facebook and let the Lookalike Audiences algorithm magically show your cake stands to the right people.

The methodology of Lookalike Audiences isn’t new. In fact, it was invented in the 1980s at data companies like Experian, where I used to work. If Experian had a company museum, a humble and largely forgotten object would have pride of place: the Kays mail order catalogue. You could turn its pages and browse Casio digital watches, coffee makers with integrated clock radios, He-Man action figures, Clairol foot spas and countless other retro items. Tucked into it would be an order form on which you could fill in the alphanumeric codes for the things you liked, and an envelope with a freepost address to send it off in.

The Kays catalogue was over a thousand pages long. Printed in colour, it was expensive to produce and to post. As a result, mail order companies needed to be selective about who they sent catalogues to in order to remain profitable. There was one obvious group to include in a catalogue mailing: people who had bought items from a previous edition. However, if you wanted to grow your mail order business, you couldn’t rely solely on existing customers – you needed to find new ones. But how on earth were you supposed to know which prospective customers to send your catalogues to? The answer was a technique called geodemographics.

[Image: Alternatives to the Sony Walkman in a 1980s mail order catalogue.]

The principle of geodemographics is the old saying that ‘birds of a feather flock together’. In other words, your postcode can predict the sort of person you are – or at least, the sort of things you buy. For example, if you live in a detached house in a suburb where there is no public transport, you are far more likely to be interested in child car seats than someone who lives in a city-centre Edwardian terrace that’s been converted into studio flats.

To help visitors understand geodemographics in practice, my imaginary Experian museum has a virtual reality installation. You put the headset on and find you’ve travelled back in time to 1989 and are looking at the world through the eyes of the database marketing manager for the Kays catalogue. You catch sight of your own reflection: you’re wearing a stonewashed denim jacket and have massive hair. There’s a sickly-sweet taste in your mouth: it’s the Hubba Bubba gum you’re chewing. You’re wondering how to meet a target your boss has set you for orders of electronics from new customers. You’ve already looked at the customer database to see which postcodes have the highest concentrations of past orders for hi-fi systems, VHS video recorders and radio cassette players. You know that by comparing the existing customer database with data from the electoral roll, you’ll be able to send a catalogue to the addresses in those postcodes that haven’t received one in the last two years; that should get you off to a good start, but it probably won’t be enough by itself. You go over to a boxy Amstrad computer terminal and monitor. At the top of the screen it says ‘MOSAIC’; below it is the instruction, ‘ENTER POSTCODE’, and a command line with a blinking cursor. You choose a postcode with a high share of past electronics orders and hit return. After a moment, the computer gives you a result: ‘FIVE MOST SIMILAR POSTCODES TO MK45 1SN: SO40 3QN, BH16 5HQ, WA5 2NB, NG10 3JB, NR33 7BT.’ Now you know which addresses elsewhere in the country are most likely to order a colour TV or a microwave oven if you send them a catalogue. And what’s more, you don’t need to waste money sending catalogues anywhere else.
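The ‘most similar postcodes’ lookup in this scene is, at heart, a nearest-neighbour search. Here is a toy Python version, with made-up postcode profiles and only three variables per postcode; real systems use hundreds of them, but the principle is the same.

```python
from math import dist  # Euclidean distance between two sequences of numbers

def most_similar(target, postcode_features, n=5):
    """Rank every other postcode by the Euclidean distance between
    its data profile and the target postcode's profile."""
    others = [pc for pc in postcode_features if pc != target]
    others.sort(key=lambda pc: dist(postcode_features[target], postcode_features[pc]))
    return others[:n]

# Invented profiles: (share of past electronics orders, detached-housing rate, mean age)
features = {
    "MK45 1SN": (0.80, 0.70, 42),
    "SO40 3QN": (0.78, 0.68, 41),
    "EC1V 2NX": (0.05, 0.02, 29),
}
```

Asking for the nearest neighbour of MK45 1SN returns SO40 3QN, whose residents buy and live much like the target postcode’s, rather than the inner-city EC1V 2NX.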

‘Mosaic’ is the name of Experian’s geodemographic classification system. Other data companies have similar systems, and they all work in a similar way: starting with a table of millions of names and addresses from the electoral roll, they add data points to as many rows as possible, building up a detailed picture of the type of people who live in each postcode. The data can come from publicly available sources like the census, from large-scale consumer surveys or it can be purchased from third-party companies (provided they’ve obtained consent from the people who shared it with them in the first place). The data includes a wide range of things, from the age and income of household members to whether they have pets and prefer caravans to hotel accommodation. It’s important to note that it isn’t necessary to have data for every row in the table; what matters is having enough data points for each postcode to paint an accurate picture. Once that’s done, each postcode can be grouped with others sharing similar data characteristics. To make its groups more intuitive and easier to remember, Experian Mosaic gives them names: when I moved across London in 2013, from leafy Dulwich to edgy Hackney, I went from ‘Uptown Elite’ to ‘Flexible Workforce’ – from an area that was in a group with Morningside in Edinburgh (‘High status households owning elegant homes in accessible inner suburbs where they enjoy city life in comfort’) to one similar to Chorlton in Manchester (‘Successful young renters ready to move to follow worthwhile incomes from service sector jobs’). Meanwhile, the postcodes that offer the best prospects for selling 1980s electronics are in the group ‘Suburban Stability’ (‘Mature suburban owners living settled lives in mid-range housing’).
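The final step – assigning each postcode’s aggregated profile to a named group – can be sketched like this. The group centroids and variables are invented for illustration; a real system like Mosaic derives its groups from far more variables using proper clustering techniques.

```python
from math import dist

# Invented group centroids: (mean age, home-ownership rate, pet-ownership rate)
GROUPS = {
    "Uptown Elite": (48, 0.85, 0.30),
    "Flexible Workforce": (29, 0.15, 0.20),
    "Suburban Stability": (52, 0.75, 0.55),
}

def classify_postcode(profile, groups=GROUPS):
    """Assign a postcode's aggregated data profile to the nearest named group."""
    return min(groups, key=lambda name: dist(profile, groups[name]))
```

A postcode full of settled, pet-owning homeowners in their early fifties lands in ‘Suburban Stability’; a postcode of young renters lands in ‘Flexible Workforce’.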

In the next room of the Experian museum would be interpretive boards explaining that since the 1980s, the real-world applications of geodemographic classification have extended far beyond mail order catalogues. By adding driving time data, Mosaic was able to show Experian clients like Greene King and Vue where they should open new pubs and cinemas. And it was geodemographic analysis using Mosaic that identified ‘Motorway Men’ – sales professionals living on the M1 and M6 corridors – as the key swing voters in the 2010 UK general election.

You might have noticed that my description of Experian Mosaic is similar to Facebook’s Audience Insights tool with which I identified people with an interest in complications of diabetes. Just as geodemographic targeting works by making inferences from postcode data, Facebook’s targeting works by making inferences from profile and behavioural data. Like the geodemographic targeting of people who are more likely to buy child car seats, the examples of Facebook targeting we’ve discussed so far have common-sense explanations: you like the Facebook page of a diabetes charity and you see an ad for diabetes travel insurance in your Facebook feed; you put a black V-neck T-shirt in your shopping basket and it follows you onto Instagram. Lookalike Audiences goes further: it upgrades geodemographics for the era of big data. Machine learning finds patterns in millions of points of profile data, behavioural data and metadata – the data about the data – that a human analyst would never be able to uncover. With these, it can create a classification system that is much more granular than Mosaic or anything else that has gone before.
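Stripped of the machine learning, the underlying idea of Lookalike Audiences can be illustrated as a nearest-neighbour ranking: score each candidate user by how close their feature vector sits to the advertiser’s seed customers, and keep the closest. The user names and two-dimensional features below are invented; Facebook’s real model learns from millions of data points, but this conveys the shape of the computation.

```python
from math import dist

def lookalike_audience(seed_profiles, candidates, size):
    """Keep the `size` candidates whose feature vectors sit closest to any
    seed customer - a crude stand-in for a learnt similarity model."""
    def closeness(uid):
        return min(dist(candidates[uid], s) for s in seed_profiles)
    return sorted(candidates, key=closeness)[:size]
```

Given two seed customers clustered in one corner of the feature space, the sketch ranks candidates near that corner ahead of those far away, just as the cake stand seller’s uploaded email list steers ads towards people who resemble past buyers.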

It’s up to you whether you want to call Lookalike Audiences ‘magic’ or just ‘data science’, but is it fair to call it dark magic? Surveillance capitalism theory certainly thinks so. It points to the ways in which Lookalike Audiences has been used in elections – most famously in efforts by Donald Trump’s election campaign in 2016 to depress voter turnout among groups favouring Hillary Clinton. Through Facebook ads targeted using Lookalike Audiences, the attention of younger women was drawn to allegations of sexual harassment levelled at Clinton’s husband, while audiences likely to contain a high proportion of African-American men were reminded of comments she had made about ‘super predators’ twenty years previously.

Does responsibility for this cynical and unsavoury use of Lookalike Audiences rest with the Facebook technologists who created the tool? And do digital marketers in political parties share some of the blame, for finding ways of using Lookalike Audiences that Facebook didn’t think of? I’m not so sure. For better or for worse, we accept smear tactics as part of electoral campaigning: in this example, Lookalike Audiences is a transmission mechanism rather than a cause.

However, there is a darkness in the use of Facebook ads in political campaigning that the company needs to do more to address. Just as they are an incredibly efficient way of finding new customers for a T-shirt store, Lookalike Audiences are an incredibly efficient way for marginal political parties to recruit new supporters. Take Germany’s right-wing populist Alternative für Deutschland (AfD), which promotes nationalism and military conscription while opposing immigration, feminism, investment in renewable energy and equal rights for gay couples. It used Lookalike Audiences in its campaign for the 2017 federal election and won ninety-one seats in the German parliament just four years after its founding. Lookalike Audiences has enabled the AfD to amass a following on Facebook that is more than twice the size of the followings of Germany’s main parties, the Christian Democratic Union and the Social Democratic Party. Furthermore, during the 2019 European elections, the AfD’s Facebook ads were systematically amplified by a network of Facebook groups and pages managed by a private individual who was sympathetic to the party’s policies.

It is beyond my pay grade to make claims about whether political parties like the AfD are legitimate, but I do think it’s a problem that Lookalike Audiences makes it so easy for new entrants in the political arena – including those whose views and policies are outside the mainstream – to find supporters. Facebook places hardly any controls on access to its advertiser tools, including Lookalike Audiences. Anyone can attempt to exert influence in an election, regardless of their agenda; you don’t need to be a member of the campaign team for a candidate or a political party – all you need is a Facebook Page and a credit card. The AfD’s European election ads were amplified by one of its supporters, but if a civil rights activist wanted to run Facebook ads criticising the AfD’s position on ‘traditional gender roles’, there would be nothing to stop them. In fairness to Facebook, the roll-out of its ‘ad authorization’ process means that in an increasing number of countries, it’s no longer possible to run Facebook ads about elections, social issues or politics without proving that you live there. That at least mitigates the risk of interference in domestic politics from overseas, but the fact that the barriers to entry in political Facebook advertising are so low is a real problem.

Another area in which Facebook disowns responsibility concerns the truthfulness of claims made in Facebook ads. Mark Zuckerberg’s rationale for this, which we’ll explore in Chapter Seven, arises from his commitment to upholding free speech. That might be a reasonable justification for Facebook not acting to remove ordinary posts or comments about politics unless they break the law, but for a long time I thought extending this logic to advertising was nonsensical. After all, freedom of speech is a human right, but freedom to advertise is not. In the UK, advertising on TV, in newspapers and in direct mail is kept honest through a combination of rules set by regulators like the Advertising Standards Authority and norms fostered by bodies like the Direct Marketing Association. The idea that ITV would allow an advertiser to make an unsubstantiated claim and invoke freedom of speech as a justification seems absurd. Similarly, it’s unthinkable that a mailing house could justify delivering tens of thousands of leaflets which made claims that were provably false. So why is this allowed on Facebook?

In 2010, the US Supreme Court ruled that laws restricting political advertising by independent organisations were unconstitutional, based on the free speech protections in the First Amendment. And as Facebook is an American company, it sees political advertising in these terms. Its default setting is to allow anyone to say whatever they like in political Facebook ads – even in Germany, the UK and other countries where the US Constitution doesn’t apply.

Meanwhile, ads that Facebook doesn’t regard as being about social issues, elections or politics are a total free-for-all. Multiple state-run Chinese news organisations have been able to amass larger Facebook followings than the BBC, the New York Times and CNN by promoting their content using Facebook ads. Throughout 2019, the China Global Television Network (CGTN) ran campaigns featuring iconic images of China, from baby pandas to high-speed trains, with a simple request to like their Facebook page. China Daily used the same tactics, but with a clickbait approach; nothing says ‘soft power’ like a video of a cockatoo eating shrimp. Both campaigns were seen by hundreds of millions of Facebook users all over the world.

Though overseas broadcasting by state media can be perfectly legitimate, Facebook is banned in China, which creates an asymmetry: these advertisers benefit from Facebook’s reach but can’t be reached with ads themselves. This may reflect Mark Zuckerberg’s longstanding aspiration to enter the Chinese market, but it seems to run counter to the idea of Facebook as a community in which members have reciprocal responsibilities, not to mention America’s strategic interests.

Just as it does nothing to check the truthfulness of claims made in ads or the integrity of advertisers, Facebook does almost nothing to verify that advertisers using Lookalike Audiences have permission to use the data they are uploading – advertisers simply check a box confirming that their use of data is compliant with the applicable laws and regulations. In practice, unscrupulous advertisers use whatever data they can get. Of the eighty-eight advertisers who uploaded my data to Facebook in the seven days to 21 May 2019, sixty-nine didn’t have my permission, and most were car dealerships in the United States, where I have never lived. The consequences of this specific example are hardly grave: at worst, I might see ads that are irrelevant to my life. The point is to demonstrate the weakness of Facebook’s controls on Lookalike Audiences.
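One genuine safeguard does exist at the point of upload: customer lists are hashed before they leave the advertiser’s systems. The Python sketch below shows the normalise-and-hash step that Facebook’s list-upload mechanism (from which lookalikes are seeded) requires. Note what it does and doesn’t achieve: hashing protects the raw email addresses in transit, but it proves nothing about whether the advertiser had permission to use them in the first place.

```python
import hashlib

def normalise_and_hash(email: str) -> str:
    """Lowercase and trim an email address, then SHA-256 it, so that two
    differently formatted copies of the same address produce the same
    hash and the raw address never leaves the advertiser in the clear."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# A toy customer list with inconsistent formatting.
customer_list = ["  Sam@Example.com ", "helen@example.com"]
hashed_upload = [normalise_and_hash(e) for e in customer_list]
print(len(hashed_upload[0]))  # 64 hex characters per hash
```

The check-a-box consent confirmation sits entirely outside this process – which is why the car dealerships could upload my data regardless.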


The Graph API

There is a final aspect of the Cambridge Analytica scandal that we haven’t yet touched on: how the data used to build a psychographic model was obtained in the first place. Many media reports called it a ‘data breach’, but that’s misleading. A breach is when data ends up in the hands of someone who doesn’t have permission to access it. That can happen deliberately when a hacker attacks a company’s computer systems, or accidentally if an employee leaves their laptop in a pub. By contrast, Aleksandr Kogan – the data scientist who built Cambridge Analytica’s notorious model – had Facebook’s permission to access Facebook users’ data through its Graph API, a data pipe built into its systems for third-party developers. At the time, Facebook was actively encouraging developers outside the company to collect huge quantities of users’ data in this way. It wasn’t a mistake that allowed Cambridge Analytica to harvest data: it was company policy.

Why would Facebook want to do that? Ironically, this goes back to Mark Zuckerberg’s personal dislike of advertising. Like a lot of software engineers, he doesn’t care for ads. In a parallel universe where he stayed at Harvard instead of dropping out to build Facebook, Zuckerberg might easily have been one of the ad rejectors I mentioned in Chapter One, posting rude gifs in the comment threads of travel insurance ads. He never wanted to build an ads business; he wanted to build a platform.

Rather than earning money by selling advertising space, platform businesses earn money by receiving small commissions on enormous volumes of third-party transactions that they play a part in facilitating – Airbnb and PayPal are two examples. The Graph API was designed to enable a platform business model; one example of the way Facebook intended it to be used involved the business directory Yelp. Pulling data from the Graph API enabled Yelp to improve restaurant recommendations for its users by taking into account the restaurants that they and their friends had liked on Facebook. Another early adopter, ironically enough, was the Guardian, which worked with Experian to develop an app that allowed users to see which news stories their Facebook friends were reading. Ultimately, Facebook hoped to become ad-free and generate revenue from third-party developers whose apps benefited from Graph API data.
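To make the Graph API less abstract, here is a Python sketch of what a Yelp-style app did with it. The response shape below is loosely modelled on the early, pre-2014 era of the API, and the data is made up – treat it as an illustration of the kind of access third-party developers were granted, not a verbatim transcript of any real call.

```python
import json

# A canned response in roughly the shape an early Graph API call for a
# user's friends and their page likes would return (illustrative only;
# in reality the app would fetch this over HTTPS with an access token).
raw = json.dumps({
    "data": [
        {"name": "Alice", "likes": {"data": [{"name": "Noble House Szechuan"}]}},
        {"name": "Bob",   "likes": {"data": [{"name": "Cafe Luna"}]}},
    ]
})

# A Yelp-style app could then mine the restaurants a user's friends had
# liked and boost those restaurants in its recommendations.
friends = json.loads(raw)["data"]
friend_likes = {f["name"]: [p["name"] for p in f["likes"]["data"]]
                for f in friends}
print(friend_likes)
```

Notice who appears in that response: not the person who installed the app, but their friends.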

Note that both these examples, from Yelp and the Guardian, required not just Facebook users’ own data but their friends’ Facebook data too. That, I think, was the only truly shocking aspect of the Cambridge Analytica scandal: the profile data of tens of millions of Facebook users was given away when their friends completed a personality quiz, entered a competition or installed a Facebook app. These users had no way of knowing that was happening, let alone of opting out. As a result, the $5 billion fine Facebook received from the Federal Trade Commission is hard to argue with.

But here’s where subscribers to surveillance capitalism theory get into a muddle again. Expressing his view that the fine was insufficient, Federal Trade Commissioner Rohit Chopra remarked, ‘The settlement imposes no meaningful changes to the company’s structure or financial incentives, which led to these violations. Nor does it include any restrictions on the company’s mass surveillance or advertising tactics.’ Did you see what he did there? Chopra made a causal connection between Facebook’s current advertising-based business model and its historic bad practice with the Graph API, even though the Graph API was created so that Facebook could develop a business model that didn’t involve targeted advertising!

One outcome of this confusion among regulators has been challenges to the legitimacy of the data analytics and targeting techniques I’ve described in this chapter. When I was bogged down in this during my research, I realised I needed to talk to someone who had thought deeply about the ethics of marketing over many years: Nigel Wilson, who was managing director of Experian Mosaic between 2007 and 2015, has been incredibly influential on the practice of targeted marketing. I caught the train to Nottingham to see him, and over lunch I asked him what he thought about a recent report from the Information Commissioner’s Office that had questioned whether the use of data analytics in political campaigning should be allowed. Nigel folded his arms and gave this bracing reply.

‘I disagree with the ICO. They’re missing the point. It’s not about the use of data for messaging and targeting – it’s about understanding the wider context and purpose of the people who are doing it. For decades, consumers have been receiving highly targeted messages that appeal to their wants, desires, hot buttons – that’s just the way commerce works. If you’re a political party, why wouldn’t you build segments that help you understand your audience better and tailor a message accordingly? Why shouldn’t you be able to tell young families about your plans to reform education, or tell commuters you’ll invest in transport infrastructure? The context for Cambridge Analytica is surely that people’s data was used without the necessary permission being granted. On top of that, Facebook didn’t apply the right controls. They didn’t check whether the targeting was being done by a wealthy citizen, an interest group, a foreign power. They didn’t check whether messages were appropriate or whether claims were accurate. Back in the 1990s, you couldn’t have used direct mail to target people with disabilities, say, with unsubstantiated claims about a miracle cure – the controls were in place to stop that happening. Frankly, it pains me that geodemographic targeting is identified as the issue.’

Rereading Nigel’s words, I feel like I’ve stepped out of a stuffy room into the sea air. It isn’t targeted marketing or Facebook’s advertiser tools that cause political problems: it’s the lack of controls on who can use them and on what they can say.