There are 3 universal symbols on this planet: the dollar sign $, tits, and the soccer ball.
—Po Bronson, “Game Day at San Quentin”
NOVEMBER 1, 2011
In the waning weeks of 2011, Facebook continued to strain mightily under the specter of Google Plus. Initial usage numbers, published by Google, were eye-popping, claiming hundreds of millions of users, and embodying every Facebooker’s worst nightmare of being overwhelmed by the Mountain View company’s greater engineering numbers, not to mention its still-dominant position as the world’s default website.
Professionally, I had well and truly been assimilated (at least overtly). My everyday work uniform had devolved to jeans, T-shirt, and a Facebook zipped fleece, a uniform that was conspicuous even by Facebook-grunt standards.* Workwise, I continued my Sisyphean task of somehow recycling user data to increase our low clickthrough rates. Facebook ads at that point were ugly, small, postage-stamp-sized picture-and-text combinations on the far right side of the user’s screen, mostly ignored by users. The notion of commercial content inside the News Feed was still sacrilegious, and not mentioned in polite company. The mere thought of using outside data in Facebook ads delivery was similarly heretical, and not even considered.
In this chaotic period, before the revenue and product discipline imposed by the IPO, the Facebook Ads product team continued to move to the haphazard beat of Gokul’s leadership. PMs were often randomly bequeathed products after another product manager had left, been fired, or been transferred to some other project, whether or not they had any qualification to lead that team, or whether that product should even exist. Via that product roulette, I came to manage the Ads Review and Quality team, in addition to the Ads Targeting team. Like the Department of Homeland Security, Ads Review and Quality had a grandiloquent title that carried a good amount of power, but also concealed a fair amount of day-to-day slapstick incompetence. Similar to the noble souls who defend our borders, Ads Review (for short) was the guardian of the Facebook Ads team, sussing out obscene ads, click fraud, payment fraud, and every flavor of shenanigan where money was converted into blue-bannered pixels.
The team consisted of two overworked engineers, coding everything from the front-end user interfaces that fraud specialists used to patrol ads to the sophisticated machine-learning algorithms that vetted ads before they reached a potential consumer’s screen. It also drew on risk and fraud teams in Austin, Texas, and Bangalore, India. These operations “specialists,” who did the actual scanning of preprocessed and prefiltered photos, were trained to spot violations of Facebook Ads policy. Some violations were tame; including text in an image, which advertisers would do to cram in more alarming ad copy, was one. Some were more universal in the Bronsonian sense. One violation we memorably failed to catch was an Israeli manicuring salon that ran a photo of a woman’s very well groomed pubis. It was so sleek and almost abstract that the reviewer failed to see what it was.
In cases like this, the review function was effectively crowdsourced by the users themselves. Angry pearl-clutchers everywhere would click the X in the upper-right-hand corner of the ad, and indignantly leave feedback. The software would calculate rates of negative feedback, weighting the most egregious reasons cited most heavily—misleading, offensive, or sexually inappropriate (MOSI, for short)—and triggering a rereview of that ad. A rejection would propagate to all versions of that image inside the ads system, minimizing the amount of human intervention required, and avoiding a duplication of effort.
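For the technically curious, the trigger logic amounted to something like the following sketch (in Python, with reason names, weights, and a threshold I’ve invented for illustration; the real values were internal and constantly tuned):

```python
# A minimal sketch of the feedback-weighted rereview trigger. The reason
# names, weights, and threshold here are all invented for illustration.

# The MOSI reasons get the heaviest weights; milder gripes count less.
FEEDBACK_WEIGHTS = {
    "misleading": 3.0,
    "offensive": 3.0,
    "sexually_inappropriate": 3.0,
    "uninteresting": 0.5,
    "repetitive": 0.5,
}

REREVIEW_THRESHOLD = 0.02  # weighted negative actions per impression


def needs_rereview(feedback_counts: dict[str, int], impressions: int) -> bool:
    """True if weighted negative feedback is high enough to kick the ad
    back into the human review queue."""
    if impressions == 0:
        return False
    weighted = sum(
        FEEDBACK_WEIGHTS.get(reason, 1.0) * count
        for reason, count in feedback_counts.items()
    )
    return weighted / impressions >= REREVIEW_THRESHOLD


# e.g., 250 "sexually inappropriate" reports over 10,000 impressions
print(needs_rereview({"sexually_inappropriate": 250}, 10_000))  # True
```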
Unscrupulous advertisers had their wiles, however, reformatting images or tweaking colors or focus slightly, so that a bit-by-bit comparison would fail to equate an already flagged photo with a “new” image that had just been submitted: changed at the bit level, the image slipped past the filter even though it looked the same to the human eye. To counteract this gaming, the photo-comparison software that propagated decisions had to be made fuzzy and inexact. Machine-learning models were trained to find obvious signs of scams in ad copy (“free iPad” was one such telltale). The user interfaces were constantly improved to make the human reviewers’ tasks easier and more efficient, so that we wouldn’t need to hire more expensive humans.
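The fuzziness worked along the lines of what’s now commonly called perceptual hashing. Here’s a toy version, and only a toy version, an illustration of the general technique rather than Facebook’s actual algorithm (the distance threshold is likewise invented):

```python
# A toy "average hash": shrink the image, then record one bit per pixel
# for whether it is brighter than the mean. Near-duplicates (recompressed,
# recolored, slightly cropped) land within a few bits of each other, so a
# flagged image still catches its mutations.

from PIL import Image  # pip install Pillow


def average_hash(img: Image.Image, size: int = 8) -> int:
    """Grayscale, shrink to size x size, one bit per pixel vs. the mean."""
    small = img.convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > mean)
    return bits


def hamming(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")


BAN_DISTANCE = 5  # invented: hashes this close count as the "same" image

# Demo with synthetic images: a gradient, and the same gradient with a
# mild brightness shift (the kind of tweak spammers used to dodge
# bit-exact comparison).
original = Image.linear_gradient("L").resize((64, 64))
tweaked = original.point(lambda px: min(px + 12, 255))

distance = hamming(average_hash(original), average_hash(tweaked))
if distance <= BAN_DISTANCE:
    print(f"distance {distance}: propagate the ban to the 'new' image")
```

A bit-exact comparison would see those two images as totally different files; the hash sees them as neighbors, which is the whole point.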
It was a terrible assignment for anybody who wanted to make his or her mark at Facebook, and it would take me months to scheme myself out of it. But before that, I had to appear as the face of this ads police department, one of the airport security lines at Facebook.
Ads Review and Quality was officially part of Product and Engineering, but it worked for Sales and Operations, which was Sheryl’s grand fiefdom. Sheryl, of course, was much more than merely Zuck’s consigliera and the Ads team’s intercessor within the senior management stratum of the company. She was the able leader of this vast, multitiered organization, with an ever-shifting cast of names and titles spanning the geographically fragmented operation. This world encompassed everything from senior ad executives closing deals with Coca-Cola to junior user-operations people deleting a fake account. In many ways, this was the boiler room of the Facebook money machine, or at least its human-powered factory floor, and Sheryl was its unquestioned overseer.
Every quarter Sheryl scheduled a gargantuan meeting meant to show off the wonderful tools engineering was building for Sales, and how well the hybrid engineering-operations teams were collaborating. Sheryl’s managerial prowess was on full display in these powwows, as she skillfully held court among the assembled lieutenants of the various subrealms. Picking up subtle psychological cues from an off-the-cuff remark here, unearthing some lingering issue that lay dormant over there, making sure every voice was heard but no voice heard too much, tamping down some burst of irrelevant drama to keep the action moving, the woman knew how to run a roomful of big names and even bigger egos.
The meeting was held in the PC Load Letter conference room, which was one of the cavernous spaces used for only the largest or most senior meetings.* Sheryl sat at the fifty-yard line of the football-field-sized table, with Don Faul at her right hand. Faul was a former Marine platoon leader and Googler who ran Online Operations, the busy human workflow of keeping the nontechnical side of an ads machine going. He resembled a more strapping version of Don Draper.
I sat at the twenty-five-yard line, near the screen where we’d be projecting. The room started filling quickly with pairs or triplets of product managers, eng managers, and ops managers. Mark Rabkin was there as my engineering analogue, one of the first engineering hires in Facebook Ads, and a man who’d soon assume a real importance in the organization. Also there was David Clune, the operational head of the Austin-based ads police, and the one who had done most of the work on the slides I’d be presenting.
First up in the Sheryl show was a product manager named Dan Rubinstein. Dan resembled a Woody Allen figure: short, thin, nebbishy, but without the crackling anxiety. Also a former Googler, he seemed like one of those old PM hands who always made sure to take good notes and get his weekly report in on time. He fronted for User Ops, which was the user police, and the user-facing version of what I did on the ads side. Ever wonder why your feed never features any form of porn or otherwise grotesque imagery? It’s because a team in User Ops has managed to sift through the billion photos uploaded a day and pick out a pile of offensive needles in an Internet-scale haystack.
On the screen now, Dan launched a demo of a tool that was essentially that: on loading the Web app, a raft of user photos appeared, which a User Ops “analyst” could easily click to eliminate, like plucking weeds from a garden. That image would be banished forever, including versions with small color changes or cropping done by veteran spammers and sketchy ad types. As he walked the room through the demo, he would click on an image of a kitten—kittens evidently represented the porny pictures they’d normally filter—and that kitten would be gone, as well as all variants of that kitten image. Click, ban, reload, click, ban, reload. A well-oiled kitten-banning machine, ladies and gentlemen.
Suddenly Sheryl interrupted: “So, what’s with all the kittens?”
Dan, a bit startled, peered at Sheryl, clearly confused.
“Why are all the bad photos kittens?”
Dan flatly replied, “We use kittens as the bad photos in demos, because the real bad photos are . . . you know . . . kind of obscene.”
“Right,” said Sheryl, “but why kittens and not something else?”
The room was deathly silent with thirty-plus sets of twitchy eyes rising from barely concealed phones and laptops to stare at Dan and his kitten-banning machine. You could almost hear everyone mentally asking in chorus: Yeah, what is it with the kittens?
Dan looked up at the screen as if noticing the kitten pics for the first time, and then turned to Sheryl and answered, almost under his breath:
“Well . . . for demo purposes we don’t show really bad photos . . . so the engineers use kittens instead. Because, you know . . . kittens and cats are like, pu—”
He stopped right there, but he almost said “pussy” in front of the Queen of Lean, Sheryl Sandberg.
“Got it!” she expectorated. After sucking in a lungful of air, as if loading for a verbal barrage, she continued. “If there were women on that team, they’d NEVER, EVER choose those photos as demo pics. I think you should change them immediately!”
Before the salvo had even finished Dan’s head was bowed, and he was madly taking notes in a small notebook. CHANGE PUSSY PHOTOS NOW! one imagined they read. He looked like a forty-year-old scolded child.
I was dying inside. You could feel either awkwardness or repressed laughter seething from everyone in the room at this unprecedented display of management wrath and PM folly. Demoing the pussy filter to Sheryl. Epic!
Dan limped along with the rest of his demo, and then it was my turn. After that high-water mark of incompetence, it was hard to fuck things up. I glided through the slides, lingering on the money shot: a plot of the number of ads reviewed versus human man-hours. The former was up and to the right (MOAR ADS!), the latter was flat (fewer expensive humans!). All was right with the Ads Review world. I drowsed through the other presentations and bolted at the first opportunity.
Ads Review was but one of the security teams at Facebook charged with the monumental task of protecting one-fourth of the Internet globally—which is what Facebook represents—from scammers, hucksters, pornographers, sexual predators, violent criminals, and every kind of human detritus. It’s a noble struggle (despite my lack of enthusiasm for engaging in it myself), and one whose combatants work mostly in the shadows. As with all police or spy agencies, the failures of the Facebook security teams were widely trumpeted, but successes rarely heralded. You moan about your friend’s breast-feeding photo being flagged, but fail to notice the complete lack of porn in your News Feed. It’s a thankless job that appeals to those with a certain shepherd-dog mentality, or simply to people who, Dexter-like, are themselves rogues and black-hat hackers who’d rather use their skills for good.
Our social shadow warriors did have one showcase: there was an internal Facebook group with the provocative name of “Scalps@Facebook.” It was essentially an online trophy case of taxidermied delinquents, the sexual predators, stalkers, and wife-beaters whom the FB security team, in conjunction with law enforcement, had managed to catch. The weird thing about it was that the posts featured profile photos of the alleged criminals, along with somewhat gloating accounts of how the “perp” had been hunted down, and which local law enforcement agencies had collaborated. And so Facebook employees would randomly see in their feed the guilty face of depraved desire: some guy in the Philippines or Arkansas, and his rap sheet about trying to induce fourteen-year-old girls to meet him, along with a line or two indicating he had been summarily dispatched into the maw of the legal system.
So why doesn’t Facebook make more of this safeguarding role?
I don’t know the official answer, but I can speculate. If Facebook were to publicize its very real efforts at stopping crime, people might associate the blue-framed window with the thought of some predatory creeper. As is, many people have an ambiguous relationship with Facebook.
Just imagine the headline: “Facebook Catches 36 Sexual Predators This Month.” Some bespectacled fortysomething mother in Wisconsin, hair in a bun and pearls tightly gripped, announces to her husband: “Honey, I just know we should get Meghan off that Facebook thing, look, it’s just crawling with perverts.”
This is, of course, a ridiculous view. Nobody thinks AT&T should be shut down when criminals use the phone system to commit a crime, or the US Postal Service regulated when a terrorist sends a bomb through the mail. But the average Facebook user considers the service to be some sort of frivolous toy, rather than a social utility on par with running water, and therefore thinks we can just shut it down if it seems to harbor any hint of criminality.
Like the CIA not exactly advertising the drone strike that vaporized a vehicle in some godforsaken land and prevented the next terrorist tragedy from happening, Facebook keeps quiet about all it does to protect its users from humanity’s worst. Whine if you must about the odd erroneously flagged post, but spare a thought for the Facebook security team, those dedicated geeks in the watchtower. They’ve likely put away as many bad guys as (if not more than) your local law enforcement agency, and they keep their vigilant guard with nary a thanks from users. Perhaps just once, marvel at your Facebook experience, at its almost total lack of pornography, spam, hate speech, and general human detritus, and consider what spectacular systems and expertise must exist for a few hundred people to safeguard the online experience of 1.5 billion users, fully a fifth of humanity, on a continual, 24/7 basis.