Keeping true to its origins in foreign information operations, Cambridge Analytica's London office saw new characters arriving almost daily. The firm became a revolving door of foreign politicians, fixers, security agencies, and businessmen with their scantily clad private secretaries in tow. It was obvious that many of these men were associates of Russian oligarchs who wanted to influence a foreign government, but their interest in foreign politics was rarely ideological. Rather, they were usually seeking help either to stash money somewhere discreet or to retrieve money sitting in a frozen account somewhere in the world. Staff were told to ignore the comings and goings of these men and not ask too many questions, though we would joke about it on internal chat logs, and the visiting Russians in particular tended to be the more eccentric variety of client we encountered. When the firm conducted internal research on these prospects, we would hear through the grapevine about the amusing hobbies or bizarre sexual escapades these powerful men got up to. And I did, admittedly, turn a blind eye to the firm's meetings with suspicious-looking clients; I knew that asking questions Nix didn't care for would only get me in trouble. But at the time, in spring 2014, two years before Russian disinformation efforts hit the U.S. presidential election, there wasn't anything innately suspicious about these Russians beyond the typical bread-and-butter shadiness the firm engaged in. That is, except for one prospective client that CA executives became both very giddy and unusually evasive about.
In the spring of 2014, the large Russian oil company Lukoil contacted Cambridge Analytica and began asking questions. At first, Nix handled the conversations, but soon the oil executives wanted answers that he was incapable of providing. He sent Lukoil CEO Vagit Alekperov a white paper I’d written about Cambridge Analytica’s data targeting projects in the United States, after which Lukoil asked for a meeting. Nix said that I should come along. “They understand behavioral micro-targeting in the context of elections (as per your excellent document/white paper) but they are failing to make the connection between voters and their consumers,” he wrote in an email.
Well, I was failing to make that connection, too. Lukoil was a major force in the global economy—the largest privately owned company in Putin’s kleptocracy—but I couldn’t see an obvious link between a Russian oil company and CA’s work in the United States. And Nix was of no help. “Oh, you know how these things are,” he told me. “You just lift your skirt a little, and then they give you money.” In other words, he wasn’t interested in the details. If Lukoil wanted to pay for our data, why should we care what they did with it?
Shortly after the first Lukoil approach, a 2014 memo on CA's internal capacity was drafted and sent to Nix. The briefing discussed, in euphemistic terms, what the firm was capable of setting up, at least in theory, if a project required special intelligence services or scaled disinformation operations on social media. (As the memo was internal, it referenced SCL; Cambridge Analytica was merely a front-facing brand for American clients, staffed entirely by SCL personnel.) "SCL retains a number of retired intelligence and security agency officers from Israel, USA, UK, Spain & Russia each with extensive technical and analytical experience," the memo read. "Our experience shows that in many cases utilizing social media or 'foreign' publications to 'expose' an opponent is often more effectual than using potentially biased local media channels." The memo discussed "infiltrating" opposition campaigns using "intelligence nets" to obtain "damaging information" and creating scaled networks of "Facebook and Twitter accounts to build credibility and cultivate followers." For many of SCL's clients, this was a standard offer—private espionage, stings, bribes, extortion, infiltrations, honey traps, and disinformation spread through fake accounts on social media. For the right price, SCL was willing to do whatever it took to win an election. And now, armed with even more extensive data sets, new AI capabilities, and millions in fresh investment, the newly formed Cambridge Analytica was looking to take this further.
The Lukoil execs came to London, where Nix had prepared a pitch deck of slides for the meeting. I sat back in my chair, curious to discover what the hell he was actually pitching. The first couple of slides outlined an SCL project in Nigeria aimed at undermining voters’ confidence in civic institutions. Labeled “Election: Inoculation,” the material described how to spread rumors and disinformation to sway election results. Nix played videos of emotional voters convinced that the upcoming Nigerian election would be rigged.
“We made them think that,” he said with delight.
The next set of slides described how SCL had worked to fix elections in Nigeria, complete with videos of voters saying how worried they were about rumors of violence and upheaval. “And we made them think that too,” Nix said.
I watched in silence as these Russian executives took notes, casually nodding along as if what they were seeing was totally routine. Next, Nix showed them slides about our data assets. But we didn’t have data assets in Russia or the Commonwealth of Independent States (CIS) markets that Lukoil primarily operated in, and our largest data set covered America. Then he started talking about microtargeting and AI, and what Cambridge Analytica was doing with the data in our possession.
I was still at a loss. At the end of the presentation, the executives asked me what I thought, and I fumfered a bit, saying, “Well, we have a diverse set of experiences and data in many places….So why exactly are you interested in all this?”
One of them responded that they were still figuring that out, and we should continue telling them more about what data and capacity CA had. But I was the one who needed answers. Why would a Russian oil company with virtually no presence in the United States want access to our U.S. data assets? And if this was a commercial project, why was Alexander showing them slides about disinformation in Africa?
But it wasn't simply internal data assets that were shown off to the firm's clients. The firm was also eager to demonstrate how much it knew about internal U.S. military operations. In another meeting, an internal slide deck created by the U.S. Air Force Targeting Center in Langley, Virginia, which the firm had somehow accessed, showed prospective clients how the United States was already "incorporating socio cultural behavior factors into operational planning" in order to gain the "ability to 'weaponeer' targets" and amplify non-kinetic force against American adversaries. Nix remained coy about his plans. This struck me as against type—how many times had I seen him banter about the firm bribing ministers or setting up honey traps? But he couldn't—or wouldn't—explain why we kept communicating with this "client." And during the discussions, he kept telling them, "We've already got guys on the ground."
A FEW MONTHS BEFORE the first round of Lukoil meetings, Cambridge Analytica had connected with a man named Sam Patten, who had lived a colorful life as a political operative for hire all over the world. In the 1990s, Patten worked in the oil sector in Kazakhstan before moving into Eastern European politics. When CA hired him, he had just finished a project for pro-Russian political parties in Ukraine. At the time, he was working with a man named Konstantin Kilimnik, a former officer of Russia's Main Intelligence Directorate (the GRU). The two had met in Moscow in the early 2000s, later worked in Ukraine for Paul Manafort's consultancy, and became formal business partners soon after Patten was brought onto CA. Although Patten denies ever giving his Russian partner any data, it was later revealed that Manafort, who served for several months as Donald Trump's campaign manager, did pass voter polling data to Kilimnik in a separate instance.
Patten was a perfect fit to navigate the world of shady international influence operations. He was also well connected among the growing number of Republicans joining Cambridge Analytica, so he was initially assigned to work in the United States. Patten was tasked with managing the logistics of research operations in America, including focus groups and data collection, and with writing some of the polling questions. In spring 2014, he started working in Oregon, taking over some of Gettleson's projects conducting social and attitudinal research on American citizens.
Soon enough, weird questions began popping up in our research. One day I was in my London office, checking reports from the field, when I noticed a project involving Russia-oriented message testing in America. The U.S. operation was growing rapidly, and several new people had been brought in to manage the surge in assignments, so it was hard to keep track of every research stream. I thought that maybe someone had started exploring Americans' views on international topics. But when I searched our repository of questions and data, the only such data I could find was being collected on Russia. Our team in Oregon had started asking people, "Is Russia entitled to Crimea?" and "What do you think about Vladimir Putin as a leader?" Focus group leaders were circulating various photos of Putin and asking people to indicate in which one he looked strongest. I started watching video recordings of some of the focus groups—and they were strange. Photos of Vladimir Putin and Russian narratives were projected on the wall, and the interviewer was asking groups of American voters how it made them feel to see a strong leader.
What was interesting was that even though Russia had been a U.S. adversary for decades, many of the participants admired Putin for his strength as a leader.
“He has a right to protect his country and do what he thinks is best for his country,” said one participant as others nodded in agreement. Another told us that Crimea was Russia’s Mexico, but that, unlike Obama, Putin was taking action. As I sat alone in the now dark office, watching bizarre clips of Americans discussing Putin’s claim to Crimea, I wanted answers. Gettleson was in America at the time. When he answered the phone, I asked if he could enlighten me about who had authorized a research stream on Putin. He had no idea. “It just showed up,” he said, “so I assumed it was approved by someone.”
Patten’s interest in Eastern European politics crossed my mind, but I didn’t give it a lot of thought. In August 2014, a Palantir staff member sent an email to the data science team with a link to an article about Russians stealing millions of Internet browsing records. “Talk about acquiring data!” they joked. Two minutes later, one of our engineers responded, “We can exploit similar methods.” Maybe he was joking, maybe he wasn’t, but the firm had already contracted former Russian intelligence officers for other projects, as the memo to Nix highlighted.
Kogan, the project’s lead psychologist since May 2014, was making trips to St. Petersburg and Moscow. He was not forthcoming about his projects in Russia, but I knew he was working on psychometric profiling of social media users. The research Kogan was doing in Russia was focused on identifying disordered people and exploring their potential for trolling behavior on social networks. His research at St. Petersburg State University, funded by a Russian government research grant, examined connections between dark-triad personality traits and engagement in cyberbullying, trolling, and cyber stalking. The research also explored political themes on Facebook, finding that high scorers in psychopathy were most likely to post about authoritarian political issues. In conjunction with clinical and computational psychologists, Kogan worked with the “data of Facebook users from Russia and the USA by means of a special web-application,” according to one of the research briefings from his Russian research team. By late summer, Kogan was delivering lectures in Russia on the potential political applications of social media profiling. I remember him mentioning to me that there was “overlap” between his work in St. Petersburg and at Cambridge Analytica, but this could have been a coincidence. My own personal belief, which I expressed to Congress, was that Kogan was not ill-intentioned, but merely careless and naïve. Objectively, the security for the data was poor.
Even before Kogan came along, CA’s parent company, SCL, had deep experience disseminating propaganda online, but Kogan’s research was well suited to targeting voters with authoritarian personality traits, identifying narratives that would activate their support. After Kogan joined Cambridge Analytica’s project, CA’s internal psychology team started replicating some of his research from Russia: profiling people who were high in neuroticism and dark-triad traits. These targets were more impulsive and more susceptible to conspiratorial thinking, and, with the right kind of nudges, they could be lured into extreme thoughts or behavior.
IT TAKES TOXIC LEADERS to create a toxic enterprise, and I think Cambridge Analytica reflected the character of Nix. Along with the obvious delight he took in intimidation, Nix possessed an uncanny gift for finding just the spot where his malice would do the most damage. He wouldn't stop calling me a "gimp" or "spaz case," for example, because he knew it made me feel weak, and because he knew that feeling weak would only make me work harder for him. As much as I resented him, for some reason I became determined to prove him wrong. The constant abuse came with an explanation: Only "the truth" could motivate someone to rise to Nix's standards. He also made sport of belittling staff, blowing through the office like some irritable tornado, tossing out insults as he passed.
On one occasion we thought Nix was going to hurt someone. I can’t even remember what provoked him this time, but for whatever reason, he flipped out and pushed everything off one of the interns’ desks. Nix was screaming, leaning in so closely that tiny droplets of spit were hitting the intern’s cheek. Tadas Jucikas, who was the largest of us, got up and walked over. “Alexander, it sounds like you need a drink,” he said. “How about you join me at the club.” After Nix left, the intern just sat there, breathing heavily, until another colleague suggested he leave for the day. We all cleaned up the mess Nix had made before he returned in a much more jovial mood, as if nothing had happened.
Sometimes he would blame the victim after losing his temper. “You always make me yell,” he’d say, as if he was not in control of his own voice. What disturbed me most was when he denied a tantrum even as I was still reeling from the effects. There is something quite powerful in being told, flatly, that the thing that upset you never happened; eventually you start to worry that you’ve gone mad. “You need to grow up and be less sensitive,” Nix would say. “I can’t trust you if you keep telling me that I lost my temper.”
We had one huge blowout that ended up having both short-term and long-term consequences. When Cambridge Analytica was officially formed, I kept refusing to sign my contract. Signing could have granted me shares, but I was nervous about making a long commitment to the firm. A voice in the back of my head warned against it.
The delay made Nix furious. Finally he snapped, locking me in a room, where he screamed and berated me. When that didn’t yield the desired result, he flipped over the chair beside me. As soon as he opened the door, I rushed out of the office and didn’t return for two weeks. We both knew that he needed me more than I needed him, because I was the only one who could build what he had promised the Mercers. But he was still too stubborn and haughty to tell me he was sorry, so after a while he asked Jucikas to convey an apology to me. I reluctantly came back to work, but I still refused to sign the contract.
CA’s client list eventually grew into a who’s who of the American right wing. The Trump and Cruz campaigns paid more than $5 million apiece to the firm. The U.S. Senate campaigns of Roy Blunt of Missouri and Tom Cotton of Arkansas became clients. And, of course, there was the losing House bid of Art Robinson, the Oregon Republican who collected piss and church organs. In the autumn of 2014, Jeb Bush paid a visit to the office. Despite having received millions from Mercer, Nix never bothered to learn much about U.S. politics, so he asked Gettleson to join him. Bush, who had come alone, began by telling Nix that if he decided to run for president, he wanted to be able to do it on his terms, without having to “court the crazies” in his party.
“Of course, of course,” Nix answered, signaling his intention to bluff and bullshit his way through the entire meeting. When it was over, he was so excited at the possibility of signing up another big American client, he insisted on immediately calling the Mercers with the good news, having apparently forgotten that the Mercers had told him on countless occasions of their support for Ted Cruz. Nix put Rebekah Mercer on speakerphone so that everyone could hear her reaction to the amazing meeting he’d just had.
“We’ve just had Governor Jeb Bush in the office, and he wants to work with us. What do you think of that?” he said proudly. After a pause, Rebekah replied flatly, “Well, I hope you told him very clearly that that’s never happening.” Then she hung up. Brutal.
And it wasn’t just presidential hopefuls who sought CA’s help. For evangelical leader Ralph Reed, Nix planned a lunch at the grand dining hall of the Oxford and Cambridge Club, on Pall Mall. Reed spent two hours describing his objectives and outlining how CA could help re-instill morality in an America fighting over same-sex marriage and other cultural issues. Nix left the meeting a little drunk. Back at the office, he announced in his typical outrageous fashion to everyone, “Well, there’s a closet case if I’ve ever seen one.”
For most of the time I was at SCL and Cambridge Analytica, none of what we were doing felt real, partly because so many of the people I met seemed almost cartoonish. The job became more of an intellectual adventure, like playing a video game with escalating levels of difficulty. What happens if I do this? Can I make this character turn from blue to red, or red to blue? Sitting in an office, staring at a screen, it was easy to spiral down into a deeper, darker place, to lose sight of what I was actually involved in.
But eventually, I couldn’t ignore what was right in front of my eyes. Weird PACs started showing up. The super PAC of future national security adviser John Bolton paid Cambridge Analytica more than $1 million to explore how to increase militarism in American youth. Bolton was worried that millennials were a “morally weak” generation that would not want to go to war with Iran or other “evil” countries.
Nix wanted us to start using disguised names for any client research in the United States and to state that the research was being conducted for the University of Cambridge. I tried to put a stop to this in an email to staff: “You cannot lie to people,” I wrote, citing possible legal consequences. The warning was ignored.
At this point I was feeling more and more as if I was a part of something that I did not understand and could not control, and that was, at its core, deeply unsavory. But I also felt lost and trapped. I started going out on all-night binges at late-night clubs or raves. A couple of times, I left the office in the evening, went out all night, and ended up coming back in without actually going to bed. My friends in London noticed that I was no longer myself. Gettleson finally said to me, “You don’t look well, Chris. Are you okay?” I wasn’t; I was despondent. There were days when I wanted to scream back at Nix, but something stopped me. I would go out, sometimes alone, and the loud music and constant contact with other dancing bodies made me feel as if I was still here and this wasn’t some dream. And if the music is loud enough, you can scream at the world and no one notices.
OUR WORK AT CAMBRIDGE ANALYTICA seemed to grow more nefarious every day. One project was described in CA correspondence as a “voter disengagement” (i.e., voter suppression) initiative targeting African Americans. Republican clients were worried about the growing minority vote, especially in relation to their aging white base, and were looking for ways to confuse, demotivate, and disempower people of color. When I found out that CA was beginning a voter suppression project, it really hit home. I thought about all the times I had gone to rallies in 2008, when Barack Obama was running, and I started to ask myself, How in the hell did I end up here? I told one of the new managers that, regardless of what the client wanted, it could be illegal to work on a project with the objective of voter suppression. Once again, I was ignored. I called the firm’s U.S. lawyers in New York and left a message asking them to call me back, but they never did.
IN JULY 2014, I was copied on a confidential memo sent to Bannon, Rebekah Mercer, and Nix by Bracewell & Giuliani, the law firm of Rudy Giuliani. Cambridge Analytica had sought advice on U.S. law regarding foreign influence on campaigns. The memo outlined the Foreign Agents Registration Act and was emphatically clear: Foreign nationals are strictly prohibited from managing or influencing an American campaign or PAC at the local, state, or federal level. The memo recommended that Nix immediately recuse himself from substantial management of Cambridge Analytica until “loopholes” could be explored. The Bracewell & Giuliani memo suggested “filtering” the work of CA’s foreign nationals through U.S. citizens. After reading the memo, I pulled Nix into a meeting room to urge him to heed the warning.
Instead, Cambridge Analytica started to require foreign staff members to sign a waiver before flying to America, accepting liability for any breach of election law. They were not informed of the advice from Giuliani’s firm. It set me off. I unloaded on Nix.
“What if they get prosecuted, Alexander?” I shouted. “That will be on you.”
“It’s their responsibility, not mine, to know what the rules are,” he replied. “They are adults. They can make decisions for themselves.”
But it was his decisions I was worried about, and I wasn’t alone. Filling me in on some new projects, a colleague on the psychology team shared similar concerns about how this research could be used to amplify, rather than moderate, racism in the populations Cambridge Analytica was focusing on. “I don’t think we should keep doing this research,” he said.
Originally, race was just one of many topics the firm explored. This in itself was not unusual, as racial conflict has played a significant role in American culture and history. Psychologists on the projects initially assumed the research would be used either passively, to document the biases of the population, or even to help reduce their effects. But without the kind of traditional ethics review that is a prerequisite of academic research, no one ever considered how this research could be misused; no one thought about how it could go wrong.
I knew Bannon would go on rants about how America was changing too much, his prophetic notion of an impending great conflict, or his misreading of dharma in Hinduism, which bordered on fetishistic Orientalism. But many of us on the CA research teams brushed him off as just another eccentric person we had to placate in the bizarre world we worked in. Many of the CA staff had experience working in far more extreme circumstances on old SCL information operations projects around the world, so by comparison Bannon felt quite tame.
But as CA rapidly grew after Mercer's investment, I had not fully grasped the scale of the race projects we were involved in. The new managers Nix and Bannon hired started excluding me, and I stopped being automatically invited to project planning meetings. I thought this was another power trip by Nix, so I simply felt annoyed rather than suspicious. But one of the psychologists on the team started coming to me to show me some of the new race projects. He showed me the master document of research questions being fielded in America, and my stomach dropped when I started reading. We were testing how to use cognitive biases as a gateway to move people's perceptions of racial out-groups. We were using questions and images clearly designed to elicit racism in our subjects. As I watched a video of a participant in one of the field experiments, a man who'd been provoked by a CA researcher's guided questioning into spasms of rage, racist insults flying from his mouth, I started to confront what I was helping to build.
In our invasion of America, we were purposefully activating the worst in people, from paranoia to racism. I immediately wondered if this was what Stanley Milgram felt like watching his research subjects. We were doing it in service to men whose values were in total opposition to mine. Bannon and Mercer were more than happy to hire the very people they sought to oppress—queers, immigrants, women, Jews, Muslims, and people of color—so that they could weaponize our insights and experiences to advance their causes. I was no longer working at a firm that fought against radical extremists who shackled women, brutalized nonbelievers, and tortured gays; I was now working for extremists who wanted to build their very own dystopia in America and Europe. Nix knew this and didn't even care. For the cheap thrill of sealing another deal, he had begun entertaining bigots and homophobes, expecting his staff not only to look the other way but to betray our own people.
In the end, we were creating a machine to contaminate America with hate and cultish paranoia, and I could no longer ignore the immorality and illegality of it all. I did not want to be a collaborator.
Then, in August 2014, something terrible happened. A veteran SCL staffer, a longtime friend and confidant of Nix’s, returned from Africa severely ill with malaria. He came into the office red-eyed and sweating profusely, slurring his words and talking nonsense. After Nix shouted at him for being late, the rest of us urged him to go to the hospital. But before he could be seen at the hospital, he collapsed and tumbled down a flight of stairs, smashing his head hard on the concrete. He slipped into a coma. His brain swelled and part of his skull was removed. His doctors worried that his cognitive functioning might never be the same.
After Nix returned from visiting the hospital, he asked HR for guidance on liability insurance and how long he had to keep paying his loyal friend, still in a coma and missing part of his skull. This seemed callous in the extreme. It was in that moment that I realized Nix was a monster. Worse, I knew he wasn’t alone.
Bannon was also a monster. And soon enough, were I to stay, I worried that I would become a monster, too.
The social and cultural research I’d been enjoying only a few months before had given birth to this thing—and it was terrifying. It is hard to explain what the atmosphere was like, but it was as if everyone had become detached from the realities of what we were doing. But I had snapped out of the daze and was now watching a revolting idea become real. My head cleared, and the real-world consequences of Nix’s evil dreams began to haunt me. Late into the evening, unable to sleep, I would stare at the ceiling, my thoughts stalled between agony and bewilderment. One night, I called my parents in Canada, at 3 A.M. their time, to ask for advice. “Read the signs,” they said. “If you can’t sleep—if you’re making calls at all hours in a panic for answers—then you know what you should do.”
I told Nix that I was leaving. I wanted to get away from his psychopathic vision—and that of Bannon—as fast as I could. Otherwise I risked catching the same disease of mind and spirit.
Nix countered by appealing to my sense of loyalty. He made me think that I would be a bad person if I abandoned my friends at the firm. I was the one who had recruited people to work on Bannon’s project. They trusted me, and I didn’t want to betray them.
“Chris, you cannot leave me alone here with Nix,” said Mark Gettleson, who had joined the company in large part to work with me. “If you go, I go.”
I didn’t like the idea of walking out on my friends and colleagues, but I hated what Cambridge Analytica had become and what it was doing in the world. I told Nix that we could discuss how I would be phased out, but that I was definitely leaving. He did what came naturally—he took me to lunch.
The restaurant was in Green Park, not far from Buckingham Palace. As soon as we sat down, Nix said, “All right, then. I was expecting we would have this conversation eventually. How much do you want?”
I told him it wasn’t about the money.
“Come on,” he said. “I’ve run this firm long enough to know it’s always about the money.”
He mentioned that I’d never asked for a raise, unlike some of my colleagues, despite how little he’d been paying me. And it was true: I had one of the lower salaries in the office, about half of what others were making, whereas recruits for Project Ripon were taking home triple to quadruple that. When I shook my head, Nix said, “Fine. I’ll just double your salary. That should do it.”
“Alexander,” I said, “this is not some game I’m playing. I am leaving. I don’t want to work here anymore. I’m done with whatever this is.” My tone deepened, and he seemed to finally realize I meant it, because then he leaned toward me and said, “But Chris, this is your baby. And I know you. You wouldn’t abandon your baby out in the streets, would you?” He must have sensed an opening, because he took the idea and ran with it. “It’s just been born. Don’t you want to see it grow up? To know what school it goes to? If we can get it into Eton? To see what it accomplishes in life?”
He seemed pleased with the metaphorical flourish, but I wasn’t the least bit moved. I told him that I felt less like a father than a sperm donor, with no power to keep the baby from growing into a hateful child. Nix quickly pivoted, suggesting we set up a Cambridge Analytica “fashion division.”
“Jesus Christ, Alexander. Are you serious? Psychological warfare, the Tea Party…and fucking fashion trends? No, Alexander. That’s ridiculous.”
Finally, he got angry. “You’ll end up being the fifth Beatle,” he said.
The fifth beetle? I thought. Was this some kind of Egyptian parable? Something to do with scarabs? What in the world was he talking about? Not until later did I realize he was talking about the band that had formed three decades before I was born.
Even after I met him halfway, agreeing to stay on until the midterms in early November, Nix continued to insist that I was making a mistake.
“You don’t even understand the enormity of what you have created here, Chris,” he said. “You’re only going to understand it when we’re all sitting in the White House—every single one of us, except for you.”
Seriously? Even for Nix, this was grandiose. I could have had a nameplate in the West Wing, he told me. I was too stupid to realize what I was giving up.
“If you leave, that’s it,” he said. “Do not come back.”
I stayed for less than a year after Bannon took over and unleashed chaos. But looking back, I struggle to understand how I could have stayed even that long. Every day, I overlooked, ignored, or explained away warning signs. With so much intellectual freedom, and with scholars from the world's leading universities telling me we were on the cusp of "revolutionizing" social science, I had gotten greedy, ignoring the dark side of what we were doing. Many of my friends did the same. I tried to convince Kogan to leave, too, and even though he conceded that the project could become an ethical quagmire, he decided to continue collaborating with Cambridge Analytica after I left. When I found out Kogan was staying, I refused to help him acquire more data sets for his projects, worried that any new data I got for him could end up in the hands of Nix, Bannon, and Mercer. What in my mind was meant to become an academic institute was becoming just another player in Cambridge Analytica's expanding web of partners. When I refused to continue helping Kogan, he demanded that I get rid of any data I had received from him, which I did. But this came at a huge cost for me personally, as Kogan had specifically added fashion and music questions to the panels so I could incorporate the survey responses into my Ph.D. thesis on trend forecasting. With the basis of my academic work now gone, I knew I would have to give up my Ph.D., which had become the only thing keeping me going. But what bothers me most is how I let Nix dominate me. I let him pick away at every insecurity and vulnerability I had, and then, in service to him, I picked away at the insecurities and vulnerabilities of a nation. My actions were inexcusable, and I will always live with the shame.
JUST BEFORE I LEFT Cambridge Analytica, the firm was planning more election work in Nigeria. As Nix had explained to Lukoil in his presentation about rumor campaigns, the African nation was familiar territory. Cambridge Analytica knew that numerous foreign interests had a hand in African elections, making it unlikely that anyone would care what the firm was up to—it’s Africa, after all. Following the frenzy of decolonization in the 1960s, many Western powers still felt entitled to interfere with their former African territories; the only difference now was the need for a measure of discretion. Europe had been built on African oil, rubber, minerals, and labor, and the mere fact of a former colony’s political independence was not going to change that.
With the Nigeria project, Cambridge Analytica pushed itself even deeper into psychologically abusive experiments. At the same hotel where Cambridge Analytica set up camp, Israeli, Russian, British, and French “civic engagement” projects operated behind fig-leaf cover stories. The unspoken belief shared by all: Foreign interference in elections does not matter if those elections are African.
The company was working nominally in support of Goodluck Jonathan, who was running for reelection as the president of Nigeria. Jonathan, a Christian, was running against Muhammadu Buhari, who was a moderate Muslim. Cambridge Analytica had been hired by a group of Nigerian billionaires who were worried that if Buhari won the election, he would revoke their oil and mineral exploration rights, decimating a major source of their income.
True to form, Cambridge Analytica focused not on how to promote Goodluck Jonathan’s candidacy but on how to destroy Buhari’s. The billionaires did not really care who won, so long as the victor understood loud and clear what they were capable of, and what they were willing to do. In December, Cambridge Analytica had hired a woman named Brittany Kaiser to become “director of business development.” Kaiser had the kind of pedigree that Nix drooled over. In their first meeting, Nix flirted with Kaiser, saying to her, “Let me get you drunk and steal all your secrets.” She had grown up in a wealthy area outside of Chicago and attended Phillips Academy, an exclusive private school in Massachusetts (alma mater of both Presidents Bush). She went to the University of Edinburgh and afterward got involved in projects in Libya. Once there, she met a barrister named John Jones who represented not only Saif Qaddafi, Muammar Qaddafi’s son, but also Julian Assange of WikiLeaks. Jones was a well-respected member of the British bar. Kaiser started consulting for him and, as a result, became acquainted with Assange. She started working at Cambridge Analytica toward the end of 2014, just as I was leaving.
Cambridge Analytica created a two-pronged approach to swaying the Nigerian election. First, they would seek out damaging information—kompromat—on Buhari. And, second, they would produce a video designed to terrify people out of voting for him. Kaiser traveled to Israel, where, according to her, her contacts introduced her to some consultants. According to internal correspondence I saw about the Nigeria project, Cambridge Analytica also engaged former intelligence agents from a handful of countries. It is uncertain who, if anyone, at Cambridge Analytica knowingly procured the services of hackers, but what is clear is that highly sensitive material about political opponents—which may have been hacked or stolen—somehow ended up in the company's possession. By gaining access to opposition email accounts, databases, and even private medical records, the firm discovered that Buhari likely had cancer, which was not public knowledge at the time. The use of hacked material was not unique to Nigeria, and Cambridge Analytica also procured kompromat on the opposition leader of St. Kitts and Nevis, an island nation in the Caribbean.
The hacking of private medical information and emails was disturbing enough, but the propaganda videos Cambridge Analytica produced were much worse. The ads, which were placed on mainstream networks, including Google, were targeted to areas of Nigeria where the population leaned pro-Buhari. A Nigerian surfing the news would encounter an ordinary-looking clickbait ad—a gossipy headline or a photo of a sexy woman. When the person clicked on the link, he or she would be taken to a blank screen with a video box in the middle.
The videos were short—just over a minute long—and they usually started with a voice-over. “Coming to Nigeria on February 15, 2015,” intoned a man’s voice. “Dark. Scary. Very uncertain.” “What would Nigeria look like if sharia were imposed as Buhari has committed to do?” The answer, according to the video, was the most gruesome, horrifying carnage imaginable. Suddenly the video cut to a scene of a man slowly sawing a blunt machete back and forth across a man’s throat. As blood spurted from the victim’s neck, he was thrown into a ditch to die. The earth around him was stained red. In another scene, a group of men tied up a woman, then drenched her in gasoline and set her on fire as she screamed in agony. These were not actors—this was actual footage of torture and murder.
A number of people left CA right after I quit, reasoning that if the firm had become too sketchy for me, the guy who knew all the secrets, then it was too sketchy, period. The Nigeria project, a new low, set off another round of departures. By March 2015, everyone I cared about—Jucikas, Clickard, Gettleson, and several others—had left Cambridge Analytica. But many others found a reason to stay. Kaiser stayed on until 2018, coming forward publicly once the firm was already sinking under the weight of the evidence I had provided to the media and authorities. She later claimed not to have known CA was hiring hackers, telling a British parliamentary inquiry that she just thought they were good at "intelligence gathering" and using "different types of data software to trace transfers between bank accounts….I don't really know how that works."
AS I LOOK BACK at my time at Cambridge Analytica, some things make a lot more sense than they did in the moment, when I became conditioned to the weirdness of the place. There were always strange people coming and going—shady characters in dark suits; African leaders wearing oversize military hats the size of dinner platters; Bannon—so if every unusual event tripped you up, you wouldn’t have lasted long.
I know now that Lukoil has a formal cooperation agreement with the Russian Federal Security Service (FSB)—the successor to the Soviet KGB. And a member of the House Intelligence Committee later informed me that Lukoil often served as a front for the FSB, conducting intelligence gathering on its behalf. Lukoil executives had also been caught conducting influence operations in other countries, including the Czech Republic. In 2015, Ukrainian security services accused Lukoil of financing pro-Russian insurgencies in Donetsk and Luhansk. “I have only one task connected with politics, to help the country and the company,” Lukoil’s CEO, Vagit Alekperov, said of his role in geopolitics.
In fact, this is likely the primary reason Lukoil would have been interested in SCL. SCL had a long history in Eastern Europe, and in 2014 it was in discussions for another NATO project on counter-Russian propaganda. SCL had previously worked on campaigns in the Baltics that blamed Russians for political problems. "In essence, Russians were blamed for unemployment and other problems affecting the economy," one old report on the project said. But beyond all that, just as Lukoil was allegedly funding pro-Russian insurgencies in Donetsk, SCL's defense division was beginning countermeasure work to "collect population data, conduct analytics, and deliver a data-driven strategy for the Ukrainian government in pursuit of their goal to win back control of Donetsk." This project was designed to "erode and weaken the Donetsk People's Republic (DPR)" and would have made the firm a significant target for Russian intelligence gathering, which was known to operate through Lukoil in Europe.
In reality, when Nix and I met with these “Lukoil executives,” we were almost certainly speaking to Russian intelligence. They likely were interested in finding out more about this firm that was also working for NATO forces. That’s likely also why they wanted to know so much about our American data, and Nix probably struck them as someone who could be flattered into saying pretty much anything. It’s entirely possible that Nix did not know to whom he was speaking, just as I did not. What made these contacts all the more concerning was that they wouldn’t have needed to hack Cambridge Analytica to access the Facebook data. Nix had told them where it could be accessed: in Russia, with Kogan.
This is not to say that Kogan would even have known about it, but gaining access to the Facebook data would have been as simple as keylogging his computer on one of his lecture trips to Russia. In 2018, after U.K. authorities seized Cambridge Analytica's servers, the Information Commissioner's Office stated that "some of the systems linked to the investigation were accessed from IP addresses that resolve to Russia and other areas of the CIS."
It’s eye-opening to summarize what was going on over those final months of my tenure. Our research was being seeded with questions about Putin and Russia. The head psychologist who had access to Facebook data was also working for a Russian-funded project in St. Petersburg, giving presentations in Russian and describing Cambridge Analytica’s efforts to build a psychological profiling database of American voters. We had Palantir executives coming in and out of the office. We had a major Russian company with ties to the FSB probing for information about our American data assets. We had Nix giving the Russians a presentation about how good we were at spreading fake news and rumors. And then there were the internal memos outlining how Cambridge Analytica was developing new hacking capacity in concert with former Russian intelligence officers.
In the year after Steve Bannon became Cambridge Analytica's vice president, the firm started deploying tactics that eerily foreshadowed what was still to come in the 2016 American presidential election. To get access to an opponent's emails, Cambridge Analytica made use of hackers, some of whom may have been Russian, according to internal documents. The hacked emails CA procured were then used to undermine that opponent, including a concerted effort to leak rumors about the candidate's health. And this stolen kompromat was then combined with widespread online disinformation targeting social media networks. The overlap in events could be entirely coincidental, but many of the personnel who worked on Nigeria also worked on CA's American operations. One year after Nigeria, Brittany Kaiser was appointed director of operations of the Brexit campaign Leave.EU, and Sam Patten would later go on to work with Paul Manafort on the Trump campaign. In 2018, Patten was indicted by special counsel Robert Mueller and later pled guilty to failing to register as a foreign agent. His business partner, Kilimnik, was also indicted but avoided prosecution by staying in Russia. It wasn't until later, after Patten was revealed to be associated with suspected Russian intelligence operatives, that I wondered again about those bizarre research projects on Vladimir Putin and Crimea.
Patten also ran research in Oregon that included extensive questioning about attitudes toward Russian foreign policy and Putin's leadership. Why would Russia care how Oregonians felt about Vladimir Putin? Because once CA modeled people's responses to the questions, the database could identify cohorts of Americans who held pro-Russian views. The Russian government has its own domestic propaganda channels, but one of its global strategies is to cultivate pro-Russia assets in other countries. If you're interested in spreading your narratives digitally, it's helpful to have a roster of people to target who are more likely to support your country's worldview. Using the Internet to cultivate local populations with Russian propaganda was an elegant way to bypass all Western notions of "national security." In most Western countries, citizens have free speech rights—including the right to agree with a hostile nation's propaganda. This right serves as a magical force field for online propaganda. U.S. intelligence agencies cannot stop an American citizen from freely expressing political speech, even if that speech was cultivated by a Russian operation. They can act only preemptively, trying to keep weaponized narratives from reaching American social networks in the first place.
Russia has always been contemptuous of America’s approach to free speech and democracy in general. When Russian leaders look at America’s history of mass movements and protests, they see nothing but chaos and social disorder. When they look at U.S. courts citing civil rights to permit gay marriage, they see Western decadence leading America into weakness and moral decline. To Moscow, civil rights and the First Amendment are the American political system’s most glaring vulnerabilities. And so the Russian state sought to exploit this vulnerability—to hack American democracy. It would work, they decided, because American democracy is an inherently flawed system. The Russians created their self-fulfilling prophecy of social chaos by targeting and domesticating their propaganda to American citizens of similar worldviews, who would then click, like, and share. These narratives spread through a system of constitutionally protected free speech, and the U.S. government did nothing to stop them. Neither did Facebook.
Was Cambridge Analytica involved in Russian disinformation efforts in the United States? No one can say for sure, and there's no single "smoking gun" proving that Cambridge Analytica was the culprit, aided and abetted by Russia. But I've always hated the expression "smoking gun," because it means nothing to an actual investigator. Instead, investigators compile small pieces of information—a fingerprint, a saliva sample, tire tracks, a strand of hair. In this case, Sam Patten worked for CA after working on pro-Russian campaigns in Ukraine; CA tested American attitudes toward Vladimir Putin; SCL's work for NATO made it a Russian intel target; Brittany Kaiser used to consult for Julian Assange's legal team; the head psychologist who was collecting Facebook data for CA was making trips to Russia to present lectures about social media profiling, one of which was titled "New Methods of Communication as an Effective Political Instrument"; CA systems were accessed by IP addresses that resolved to Russia and other CIS countries; internal memos referenced the firm's retention of former Russian intelligence and security officers; and we have Alexander Nix telling Lukoil about Cambridge Analytica's U.S. data sets and disinformation capacity.
When I had lunch with Nix to tell him I was quitting, he was clear about how he thought things would play out. “The next time you see me,” he said, “I will be at the White House. And you will be nowhere.” As it turned out, he wasn’t that far off. When I next saw Alexander Nix, nearly four years after I told him I was quitting, he was in the British Parliament, answering questions about lies he had told in a parliamentary inquiry. His reputation was being eviscerated before my eyes, but, characteristically, he didn’t seem to realize it—or maybe he just didn’t care. When he saw me sitting in the gallery, he simply winked.