LIVING IN A FISHBOWL

“You have zero privacy . . . Get over it!”

Scott McNealy

In the comedy movie Lost in America, David Howard—a yuppie ad exec played by comedian Albert Brooks—is denied a promotion. Instead, his boss offers him a lucrative account headed up by a bald Manhattanite. Howard refuses, is fired, then stomps out of his boss’s office, screaming, “Don’t have lunch with this man! . . . He’ll tell you all about the future—how good the future’s going to be here. I’ve seen the future! He’s a bald-headed man from New York!”569

Likewise, if you want to know where personal privacy is headed in our world, I’m here to tell you, “Don’t believe the hype from scientists promising a utopian future. I’ve seen the future. It’s the technology-intoxicated citizens of China!”

After decades of living under Communism, the Chinese are used to having very little privacy. According to Yang Wang, a Syracuse University web expert, the most common Chinese word for privacy—yinsi—didn’t even appear in popular dictionaries until twenty years ago.570

But today’s lack of privacy is worse than ever, largely thanks to facial-recognition technology, the powerful marriage of cameras and artificial intelligence. In the typical system, a camera records a person’s image and AI software translates the image into a set of telltale measurements: for instance, the distance between the eyes, the distance from the tip of the nose to the upper lip, the width of the nostrils, the depth of the eye sockets, the shape of the cheekbones, the length of the jawline, and so forth.571

That set of numbers constitutes a unique faceprint, which is stored on a local computer or in a remote cloud for easy reference. The system’s AI considers two faceprints a match if their numbers are identical or nearly so. As an oversimplified example, the sets {12, 3, 9} and {12, 2, 9} would be flagged as a possible match, but not {12, 3, 9} and {5, 3, 10}.
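To make the matching idea concrete, here is a minimal sketch in Python of how such a comparison might work, using the three-number faceprints from the example above. Treating “closeness” as a Euclidean distance and the particular threshold value are illustrative assumptions on my part, not the internals of NeoFace or any other real product; commercial systems use far longer feature vectors and carefully tuned thresholds.

import math

def faceprint_distance(print_a, print_b):
    # Euclidean distance between two faceprints (lists of facial measurements).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(print_a, print_b)))

def is_possible_match(print_a, print_b, threshold=1.5):
    # Flag a possible match when the two sets of measurements are nearly identical.
    return faceprint_distance(print_a, print_b) <= threshold

print(is_possible_match([12, 3, 9], [12, 2, 9]))   # True: the numbers are close
print(is_possible_match([12, 3, 9], [5, 3, 10]))   # False: the numbers differ too much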

In the old days, photos were compared by eye. The labor-intensive process was severely restricted not just by time requirements but by poor picture quality and indirect camera angles.

Today’s facial-recognition technology is lightning fast. NeoFace—currently a leading facial-recognition program, created by NEC—can scrutinize more than three million images per second!572 And it does so while compensating for grainy or distorted images, shadows, and different camera angles.

Recently, NeoFace easily beat the competition in a test to spot a single, target person sitting in a packed stadium far away from the camera, frequently turning his face this way and that.573 “NEC is very pleased to be awarded such a high-profile honor,” says Raffie Beroukhim, a senior VP at NEC. “The award validates the visionary innovation of our NeoFace facial recognition portfolio and solutions, in addition to the impact and value it has brought to our government and public-sector clients.”

Those clients presently include at least a dozen stores and hotels, which use NeoFace to constantly monitor the faces of customers entering their premises. NeoFace quickly compares the incoming live images with photos of high-value customers—and is not easily fooled, even if a VIP is traveling incognito, wearing a hat and sunglasses. “If a match is found, usually within a split second,” explains an NEC booklet on facial recognition titled “It’s All About the Face,” “the sales assistant is alerted on the smartphone or iPad to provide personalized service.” The NeoFace dossier includes “the customer’s dress size or other preferences gleaned from previous purchases.”574

Smart cameras now exist that can even read our mood, based on our facial expression—and in some cases, our sex and age as well. The best ones are from MIT’s Media Lab,575 Affectiva,576 Emotient (which was purchased by Apple in 2016),577 and Dahua, a major Chinese company.578

Look around the next time you leave the house. Emotion-reading cameras are being used by marketing firms to analyze how consumers react to products and advertisements, by doctors to diagnose the mental and emotional states of patients, by families to help communicate with autistic loved ones, and by retailers to watch shoppers’ faces as they peruse the store shelves.579

Some facial-recognition programs—including one recently developed by Baidu, China’s answer to Google—can even compare photos of a person taken at different ages. In a moment, we’ll see a dramatic example of its usefulness.580

Similar recognition technology is also being trained on cars. Deep-Glint, a Beijing-based software company, sells AI systems that can scrutinize security-cam video for automobiles of any specified color, make, and model.581

I’VE SEEN THE FUTURE

Today, as I indicated, the explosive impact of facial-recognition technology is most evident in China. “In China, facial-recognition technologies are as good as those developed in western countries,” says Wang Shengjin, an electrical engineer at Tsinghua University. “But we are far ahead when it comes to deploying it commercially.”582

In the name of entertainment, convenience, and security, Chinese citizens submit to being photographed everywhere they go. And I mean, everywhere.

Convenience

The Chinese enthusiastically use facial-recognition technology to unlock their cell phones and to enter offices, hotels, schools, planes, trains, and taxis. All of which is quite convenient.

China Merchants Bank operates ATMs that scan people’s visages before dispensing money. The technology is sophisticated enough to tell the difference between a real, 3D person’s face and a fake, 2D photo of a person’s face. “We won’t need to remember another password,” crows Xu Bing, co-founder of SenseTime, a Beijing-based facial-recognition technology company. “All you’ll need to do to unlock your phone or log in to an account is scan your face.”583

A pioneering facial-recognition app developed by Ping An Bank—reportedly the first of its kind in the world—even allows people to apply for a speedy loan using just their faces as identification. “The technology is based on a complex neural network,” explains the bank’s press release, “achieving facial recognition with even greater accuracy than the human eye—99% compared to 97.5%.” Moreover, the app supposedly “compensates for the natural aging process and is even able to differentiate between twins.”584

Not to be left behind, kiosks at a KFC in Beijing take photos of customers and use AI to guess their sexes and ages. Based on that information, the smart kiosks recommend menu items KFC thinks the customers will like. According to Baidu, which created KFC’s AI software, a young male is likely offered “crispy chicken hamburger, roasted chicken wings and coke.” Whereas a middle-aged female is typically presented with the choice of “porridge and soybean milk for breakfast.”585

Entertainment

Face++, a Chinese startup, has created a suite of fun face-detection apps. They include Camera 360, which picks out every face in a photo and frames it in a box, to which you can attach any kind of highly personal information and store it for later use. The same is true of another popular app, Meitu.586

Security

Predictably, Xi Jinping’s Communist government is making full use of facial-recognition technology—all in the name of security. Such intense video surveillance is necessary, goes the official party line, to keep honest, law-abiding citizens safe from bad guys.

Presently, there are 176 million surveillance cameras in China, a number Xi’s administration openly plans to increase by another 450 million by 2020.587 In Beijing alone there are 4,300 officers constantly monitoring the live feeds of more than 46,000 cameras—reportedly enough to keep an eye on every square inch of the city.588

By age sixteen, every Chinese citizen is required to obtain a government-issued photo ID. All the photos are stored in a vast electronic library that helps authorities keep China’s 1.4 billion men, women, and children on a short leash. And I do mean short.

At least sixteen Chinese provinces, cities, and municipalities use something called Skynet, a security system that marries China’s vast network of surveillance cameras and photo libraries with the very latest in facial-recognition technology. Reportedly, Skynet can identify forty telltale features on a person’s face and scan three billion photos in one second with 99.8% accuracy!589

In many cities, if you jaywalk near a major intersection, a camera will catch you in the act and instantly post your photo on a giant electronic screen in order to shame you. Also, your photo will be transmitted to the police, who will then use facial-recognition software to quickly scan their vast database of government photo IDs and find out exactly who you are.590

“For frequent offenders,” a China Daily news article explains, “the system will add them into a list connected with credit-based services of government departments and banks.” That is, the credit scores of serial jaywalkers will take a hit, with all the unpleasant consequences that entails.591

In Shenzhen, authorities are going one step further, by teaming up with cell phone companies and powerful social media platforms such as WeChat and Sina Weibo. According to their plan, if you jaywalk in the city, you will not just be publicly named and shamed but instantly sent a text—and possibly a ticket—from the police. “Jaywalking has always been an issue in China,” says Wang Jun, marketing director for Intellifusion, a Shenzhen-based AI company. “But a combination of technology and psychology . . . can greatly reduce instances of jaywalking and will prevent repeat offences.”592

In February 2018, LLVision Technology, a Beijing-based company, upped China’s AI surveillance capabilities even further. Its sunglasses fitted with small, facial-recognition cameras connected to handheld photo databases now allow police officers to spot criminals and other persons of interest anywhere, any time. “By making wearable glasses, with AI on the front end, you get instant and accurate feedback,” says Wu Fei, LLVision’s CEO. “You can decide right away what the next interaction is going to be.”593

To be fair, China’s prolific use of cameras and facial-recognition technology does have its upsides. In 1990 six-year-old Fu Gui was kidnapped and sold to foster parents a long way from home. Tragically, such kidnappings are widespread in China, victimizing more than 20,000 youngsters annually.

As a grownup, Fu Gui was naturally curious about his biological roots. So, he submitted a photo of himself at ten to a nonprofit group called Baobei Huijia, Chinese for “Baby Come Home.” Coincidentally, his biological parents submitted childhood photos of him as well. Using Baidu’s facial-recognition technology, Baobei Huijia successfully reunited Fu Gui and his parents after twenty-seven long years of separation.594

For each such uplifting story, however, there are many more that drive home the dark aspects of China’s monumental indifference to privacy. Stories that should serve as warnings to the rest of the world.

Stories like this one.

In 2017 China’s State Administration of Religious Affairs began installing surveillance cameras inside and around prominent houses of worship. “Officially, the reasoning for the cameras are ‘safety’ and ‘antiterrorism’ precautions,” reports China Aid, a Texas-based, nonprofit, religious-rights association.595

In the east-coast industrial port city of Wenzhou—which has the highest concentration of Christians on the mainland—Christians vigorously resisted the heavy-handed surveillance, but to no avail. “Government officials came to the churches and put up cameras by force,” reports one unnamed Christian in the city. “Some pastors and worshippers who didn’t agree to the move were dragged away.”596

GOING THE WAY OF CHINA

At this point, I’m guessing you might be thinking, “But that’s Communist China. This kind of invasion of privacy would never happen in the United States.”

Is that so?

I agree it’s not very likely Americans will ever see churches surveilled in such a Draconian manner. But make no mistake, ordinary Americans by the millions are falling for the same three excuses—entertainment, convenience, and security—to extinguish personal privacy in unprecedented and irreversible ways.

In 2017 Apple released its new iPhone X, whose big selling feature is Face ID, a sophisticated facial-recognition algorithm that lets users unlock their phones simply by looking at their screens.597

Convenient?

Yes, of course.

But despite Apple’s claim that our facial profiles are completely protected, Reuters quickly learned that Apple grants third-party app developers access to our faceprints. “With the iPhone X,” the Reuters report warns, “the primary danger is that advertisers will find it irresistible [to use our faceprints, together with mood-reading AI technology] to gauge how consumers react to products or to build tracking profiles of them . . .”598

JetBlue Airways is presently experimenting with facial-recognition technology to replace boarding passes and passports on its flights from Boston to Aruba. “We hope to learn how we can further reduce friction points in the airport experience,” says JetBlue executive Joanna Geraghty. “Self-boarding eliminates boarding pass scanning and manual passport checks. Just look into the camera and you’re on your way.”599

Finally, as I write this, the US Department of Homeland Security (DHS) is announcing its wish to use facial-recognition cameras to scan people’s faces as they cross our borders. The department’s Science and Technology Directorate wants to identify border crossers reliably, even if they are wearing hats, sunglasses, fake beards, or looking away from the camera—and without requiring people in cars to stop or slow down.

The intended targets are foreign nationals, to make sure those entering the United States are the same ones leaving and not imposters. But everyone crossing our borders will be photographed, including law-abiding citizens.600

For the record, there’s about a one-in-two chance your photo is already in the FBI’s Next Generation Identification-Interstate Photo System (NGI-IPS), a massive, centralized database comprising some 411 million facial images and powerful facial recognition software.601

Over the years, the bureau has been quietly aggregating photos from many sources, including police departments, security cameras, departments of motor vehicles, and the State Department (which processes passport and visa photos). Yet, the FBI did not publicly reveal NGI-IPS’s existence until 2015.602

News of the NGI-IPS did not sit well with the US Congress, which held a hearing about it in March 2017. “I’m frankly appalled,” remarked Representative Paul Mitchell (R-MI). “I wasn’t informed when my driver’s license was renewed my photograph was going to be in a repository that could be searched by law enforcement across the country.”603 Not even fingerprints are collected so surreptitiously.

“This is really Nazi Germany here, what we’re talking about,” Congressman Stephen Lynch (D-MA) averred. “I have zero confidence in the FBI . . . to keep this in check.”604 Alvaro Bedoya, a witness at the hearing and coauthor of The Perpetual Lineup: Unregulated Police Face Recognition in America, agreed.605 “No federal law controls this technology, no court decision limits it. . . . This technology is not under control.”606

“Imagine the world where the cops are going down the street and they’ve got Google Glass on and their body cameras are recognizing people,” warns Barry Friedman, director of NYU’s Policing Project. “And it’s not just recognizing them, but they’re getting their security scores at the same time . . . based on how dangerous the algorithms think they are. That’s one scary world.”607

THE EYES HAVE IT

If you’re still skeptical the United States could ever go the way of China, consider this simple fact: according to analysts at IHS Markit, there are already about 50 million surveillance cameras in the United States.608 That’s about one camera for every 6.5 Americans—worse than China’s one camera per eight citizens.609

IPVM, a prominent video-surveillance information organization, says most Americans are clueless about video surveillance. According to a study it commissioned, 55 percent of Americans guessed they were seen by security cameras between zero and four times per day. In fact, the actual number is more than fifty times per day!610

IPVM illustrates the point with a typical day in modern-day America:

8:00 am: Get coffee (cameras in Starbucks, Dunkin Donuts)

8:30 am: Go to school/office (cameras in parking lot and building interior)

12:15 pm: Stop at ATM before lunch (cameras around bank exterior and at ATM)

12:30 pm: Have lunch (cameras at eatery and surrounding businesses)

5:00 pm: Go to gym after work (cameras at check-in desk and workout areas)

5:45 pm: Pick up dry cleaning (camera at front register)

6:00 pm: Stop for gas (cameras at pumps and in convenience store)

6:15 pm: Car wash (cameras at entryway and inside washing bay)

7:00 pm: Pick up kids from practice/game (cameras in parking lot and building exterior)

This inventory doesn’t even include time spent on US highways, where traffic cams are pervasive these days. Nor does it include malls and other enclosed shopping centers, which are plastered with security cameras that, working together, can literally follow you from store to store.

In the surveillance industry, doing exactly that is called Tag and Track. A camera at one location uses AI to spot you in a crowd, then keeps its eye on you, right up until the time you enter the view of another AI camera. “The notion that you can tag a person and let the system do the tracking is a dream come true for CCTV [closed-circuit TV] operators,” raves Sergio Velastin, a scientist at England’s Kingston University, one of the key places where the Tag-and-Track technology was developed.611
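The handoff logic described above can be sketched roughly as follows. This is a minimal illustration assuming hypothetical appearance “feature vectors” and an arbitrary similarity threshold of my own choosing; it is not the actual Kingston University research or any commercial CCTV product.

import math

MATCH_THRESHOLD = 0.6  # hypothetical appearance-similarity cutoff, for illustration only

def distance(a, b):
    # Euclidean distance between two appearance feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class TagAndTrack:
    # Keeps following one tagged person as they move from camera view to camera view.

    def __init__(self, tagged_features):
        self.tagged_features = tagged_features  # appearance vector of the tagged person
        self.last_seen = None                   # (camera_id, timestamp) of the latest sighting

    def process_detections(self, camera_id, timestamp, detections):
        # detections: list of appearance vectors seen by this camera in one frame.
        for features in detections:
            if distance(features, self.tagged_features) <= MATCH_THRESHOLD:
                self.last_seen = (camera_id, timestamp)  # handoff: this camera now holds the track
                return True
        return False

tracker = TagAndTrack(tagged_features=[0.2, 0.8, 0.5])
print(tracker.process_detections("cam_12", "17:42:03", [[0.9, 0.1, 0.4], [0.21, 0.79, 0.52]]))  # True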

The proliferation of surveillance cameras in the United States (and elsewhere) is partly driven by the explosive popularity of low-priced home security systems from companies such as Ring612 and Google’s Nest Cam.613 “These units mounted at my house are equipped with a 43 optical zoom, night vision, and motion activation technology,” reports Kentucky newspaper columnist Kevin Moore. “I now know all my neighbors’ work schedules, how many cats, rabbits and coyotes live in my neighborhood, who lets their dog relieve themselves in my yard, and that the police officer down the street works second shift and rolls in just before midnight most evenings.”614

When you compare not entire nations, but individual cities, some Brits enjoy even less privacy than some Yanks and Chinese. In fact, according to an analysis by WorldAtlas, “London is the most spied-on city in the world.”615

It is estimated the city is blanketed by about 500,000 security cameras616—which is about one camera for every fourteen Londoners.617 That’s far less privacy than a person gets in Beijing, the world’s second-most-spied-on city, where there is one camera per 472 citizens.618 (According to WorldAtlas, Chicago and New York City are the third and fourth most spied-on cities in the world.)

London’s ubiquitous cameras even spy on the city’s cars. To date, the so-called Automatic Number Plate Recognition (ANPR) initiative has taken some twenty-two billion photos of front and back license plates.619

“As a vehicle passes an ANPR camera,” London’s police website explains, “its registration number is read and instantly checked against database records of vehicles of interest. Police officers can intercept and stop a vehicle, check it for evidence and, where necessary, make arrests.”620

An average of 300 security cameras capture the images of a typical Londoner per day—surely, an Orwellian statistic.621 “We are sleepwalking into a surveillance society where we’re watched from control rooms by anonymous people,” says Emma Carr, past director of Big Brother Watch, a London-based civil rights group.622

Many see advantages to the tight surveillance, especially in this age of terrorism, and are more than willing to sacrifice privacy for security. “I think it’s a good thing, especially at night, to think someone is keeping an eye on things,” says Londoner Jane Taylor. Fellow city dweller Nadine Shah, a bank worker, feels the same way. “If you’re not doing anything wrong, you’ve got nothing to worry about have you? If they [the cameras] deter crime and help the police, I don’t see that being a problem. People say it’s like ‘1984’ but it’s a long way from that.”623

But is it, really?

Think back to the FBI’s NGI-IPS, which comprises powerful facial-recognition software and a massive database of people’s photographs. According to the bureau’s unpublished standard of accuracy—obtained by the Electronic Privacy Information Center, a nonprofit, Washington, DC-based watchdog group—“NGI shall return an incorrect candidate a maximum of 20% of the time.”624 In other words, in as many as one out of every five searches the system may point to an innocent American—possibly you or me—erroneously implicating that person in a criminal investigation.
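To see what a per-search error rate like that implies at scale, here is a rough back-of-the-envelope sketch; the annual search volume below is a hypothetical number chosen purely for illustration, not a figure published by the FBI or the GAO.

# Hypothetical illustration of how a per-search error rate compounds over many searches.
error_rate = 0.20            # up to 20% of searches may return an incorrect candidate
searches_per_year = 100_000  # assumed annual search volume, for illustration only

expected_incorrect = error_rate * searches_per_year
print(f"Searches expected to return an incorrect candidate: {expected_incorrect:,.0f}")
# At a 20% rate, roughly one in every five searches can surface an innocent person's photo.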

In a study titled, “Error Rates in Users of Automatic Face Recognition Software,” researchers at Australia’s University of New South Wales report evidence that suggests the error rate of the FBI’s facial-recognition system is considerably more than 20 percent.625

Truthfully, says Diana Maurer, director of Homeland Security and Justice Issues at the US Government Accountability Office (GAO), the FBI “doesn’t know how often the system incorrectly identifies the wrong subject.” Consequently, she says, “Innocent people could bear the burden of being falsely accused, including the implication of having federal investigators turn up at their home or business.”626

In 2016 the GAO published a scathing critique of the NGI-IPS titled, “Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy.”627 The FBI issued a ten-page reply, saying:

“The FBI fully recognizes that the automated nature of face recognition technology and the sheer number of photos now available for searching raise important privacy and civil liberties considerations. For that reason, the FBI has made privacy and civil liberties integral to every decision from the inception regarding its use of face recognition technology.”628

But in 2016 the FBI asked that its entire NGI databank—not just photos, but fingerprints, palm prints, iris scans, and all—be exempt from parts of the Privacy Act of 1974. The venerated act protects our privacy by severely restricting “the collection, maintenance, use, and dissemination of information about individuals that is maintained in systems of records by federal agencies.”629

“The NGI system is a law enforcement database,” observes Pindrop, an international security consulting firm with headquarters in Atlanta and London, “but it contains records from a variety of non-law enforcement sources. It has fingerprints and other biometric identifiers from some employment records, humanitarian and relief efforts, and from some foreign sources.”630

In August 2017, to the dismay of privacy advocates, the FBI got the exemptions it wanted to keep the NGI’s contents secret. As a result, we Americans are now forbidden from knowing whether our photos, fingerprints, palm prints, iris scans, personal information—you name it—are in the NGI. Moreover, if they are, and any of that information is wrong, we are forbidden from correcting it.631

Predictably, the FBI invokes security to defend the new privacy-busting rules. “The listed exemptions,” states the official US Department of Justice proclamation, “are necessary to avoid interference with the Department’s law enforcement and national security functions and responsibilities of the FBI.”632

NO GOING BACK

We’re not yet China, thank goodness. But as we’ve seen, even in America, the vaunted bastion of liberty, there are very few places left anymore where we aren’t on camera. Even when roaming the nation’s proverbial wide-open spaces, we’re likely to end up in photos or video taken by nearby tourists brandishing their smartphones. They’re everywhere these days.

“One hundred years ago, everyone could have personal privacy,” remarks Bruce Schneier, a renowned privacy specialist and fellow at Harvard University’s Berkman Klein Center.633 “You and your friend could walk into an empty field, look around to see that no one else was nearby, and have a level of privacy that has forever been lost.”634

That’s why I began this chapter by saying, “I’ve seen the future.” Not because I believe America is in any real danger of falling victim to Communism. No. Right now, I believe, the single greatest threat to privacy in America is our giddy infatuation with technology—in particular, face-recognition technology. By blithely embracing today’s brave-new-world innovations in the name of entertainment, convenience, and security, we seem to be voluntarily progressing toward the equivalent of a totalitarian government.