“In the kingdom of glass everything is transparent, and there is no place to hide a dark heart.”
Vera Nazarian
Privacy was not always considered a good thing. During the Middle Ages, explains Alice Jane Cooley in her PhD thesis at the University of Toronto’s Centre for Medieval Studies, “there was a heightened suspicion of privacy as a state conducive to illicit activity and facilitative of sin.”701
Today, centuries later, the concern is still relevant. Consider the many crimes still being committed in secrecy: murder, robbery, vandalism, blackmail, embezzlement, you name it. In such cases, privacy-busting measures—from x-raying airline passengers to microchipping violent criminals—are needed to help maintain law and order.
Sherlock Holmes, fictional literature’s most famous sleuth, practiced his craft between 1880 and 1914, when the concept of a crime scene was just becoming popular. “Holmes throughout his adventures would constantly make reference to how crime scenes were easily contaminated and he always emphasized how maintaining the integrity of the scene was of the utmost importance,” notes real-life forensic scientist Robert Ing.702
In 1918 French scientist Edmond Locard formulated the rules informing crime scene investigations to this day. His main idea, known as Locard’s Exchange Principle, states that in the process of committing crimes, perpetrators inadvertently and inevitably leave behind and/or take with them bits of incriminating evidence. If gathered and analyzed properly, that body of evidence can be used to track a criminal down.703
As the crime scene’s importance has grown, forensic technology has greatly improved. Real-life detectives in Holmes’s day used magnifying glasses to analyze hair and fabric samples, microscopes to distinguish bloodstains from generic stains, and chemical tests to differentiate human blood from animal blood, as well as to identify a wide variety of poisons within body tissue. Those early detectives were also among the first to bring criminals to justice by using fingerprints, blood types, and the telltale scratches fired guns create on bullets.704
Today, more than ever, science is the enemy of a criminal’s privacy and of any hope of evading arrest and prosecution. As attorney Nicole Chauriye puts it, “Technology is killing our opportunities to lie.”705
Here are some of today’s privacy-busting technologies helping to out even the sneakiest behavior.
Digital Cameras
• Eye in the Sky. In 2016 a man using the screen name YAOG claimed he was tipped off by a friend that his wife of eighteen years was cheating on him. Twice he tried tailing her on foot as she walked to work, but to no avail. That’s when he had the idea of using a tiny video camera aboard a high-flying drone to get at the truth. Sure enough, he says, it worked: the aerial spy caught his wife appearing to rendezvous with, kiss, and get into the car of a local Romeo.
YAOG posted the video of his allegedly unfaithful wife on YouTube for all the world to see. “Here she is waiting at the intersection, waiting at the lights for what seems like forever, and here she is taking her hair down,” he says, in a blow-by-blow account. “If you watch carefully, you’ll be able to see eighteen years [of marriage] go down the f---ing drain!” To date the video has racked up more than fourteen million views.706
In follow-up YouTube postings, YAOG claimed he and his wife were legally separating, despite their having young children and her purportedly being contrite and wishing for a reconciliation. “I told her that can’t happen,” he reports in a YouTube video titled “Marriage Update #2.”707
In November 2017, however, the couple—John and Donna from Honesdale, Pennsylvania—went public, appearing on the TV show Inside Edition to say they were reconciling. Ironically, John credits the spy-in-the-sky for saving his marriage. “If I hadn’t sent that drone up and saw what happened,” he says, “I believe that the situation [Donna’s budding extramarital affair] would have gotten more intimate.” He adds, “People aren’t perfect. People deserve second chances.”708
• Eye in the Classroom. In late 2016 Caleb O’Neil, a nineteen-year-old freshman at Orange Coast College in Southern California, was sitting in his human sexuality class, listening attentively to Professor Olga Perez Stable Cox. Without warning, she launched into a political tirade about the election of President Donald Trump. “We have been assaulted,” Cox told her young captive audience. “It’s an act of terrorism.”709 She reportedly finished up by asking Trump supporters in the classroom to stand up, then called them “cowards” when they did not.710
Using his smartphone, O’Neil videotaped most of the rant. “I pulled my phone out,” he recounts, “because I was honestly scared that I would have repercussions with my grades because she knew I was a Trump supporter.”711
O’Neil showed the videotape to the school’s chapter of College Republicans and to administration officials. Unhappy with the administration’s slow reaction, Joshua Recalde-Martinez of the College Republicans posted Cox’s political diatribe on YouTube, where it went viral. “At this point it’s not even education anymore,” says Recalde-Martinez about Cox’s polemic, “it’s indoctrination.”712
At first, Orange Coast College responded by supporting the professor and suspending O’Neil, claiming he had no right to videotape Cox without her consent. But after facing a firestorm of criticism by O’Neil’s attorneys and the public, the college reversed its decision and reinstated the freshman.713
• Eye on the Toilet. In 1420, on 667 acres of what is now central Beijing, Chinese Emperor Yongle built a unique campus of imperial religious buildings. The now-popular Ming Dynasty tourist attraction is called the Temple of Heaven Park.714
In 2007 the park began offering free toilet paper in its public restrooms. But lately, too many people—especially the elderly, it is claimed—have been stealing the toilet tissue in large quantities.715
In 2017 the communist government put its foot down and installed high-definition facial-recognition cameras to thwart the bad actors. Bathroom users must now look into the camera for three seconds, after which the smart device dispenses exactly two feet of toilet paper.
Need more?
Wait nine minutes, then look into the camera again, and you’ll get another two feet of tissue. Unless the camera’s AI software nails you as a repeat offender, in which case you get nothing.716 (A sketch of this rate-limiting logic appears below.)
After the first two weeks of operation—during which, predictably, there were many glitches—park officials reported that toilet paper consumption had plunged by 20 percent. The savings prompted the government to offer visitors two-ply tissue instead of the standard-issue single-ply paper.717
For many, however, the small luxury does not nearly offset the ignominy of this latest high-tech intrusion into their personal lives. “I thought the toilet was the last place I had a right to privacy,” lamented one park visitor, “but they are watching me in there too.”718
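For the technically curious, the dispenser described above amounts to a simple piece of rate limiting keyed to a recognized face. Below is a minimal sketch in Python, assuming only the three rules reported in the story (one two-foot serving per face, a nine-minute cooldown, and nothing for flagged repeat offenders); the names and the stubbed-out face-matching step are invented for illustration.

```python
import time

# Minimal sketch of the dispenser's rate-limiting logic, assuming the rules
# reported above: one two-foot serving per recognized face, a nine-minute
# cooldown, and nothing for flagged repeat offenders. The face-recognition
# step itself is assumed to happen elsewhere and simply hands us a face_id.

COOLDOWN_SECONDS = 9 * 60      # nine minutes between servings
SERVING_INCHES = 24            # "exactly two feet" of paper

last_served = {}               # face_id -> timestamp of the last serving
repeat_offenders = set()       # face_ids flagged by the (assumed) AI layer

def request_paper(face_id, now=None):
    """Return how many inches of paper to dispense for this face."""
    now = time.time() if now is None else now

    if face_id in repeat_offenders:
        return 0                                  # flagged: dispense nothing

    last = last_served.get(face_id)
    if last is not None and now - last < COOLDOWN_SECONDS:
        return 0                                  # still inside the cooldown

    last_served[face_id] = now
    return SERVING_INCHES                         # one two-foot serving

# A first request succeeds, an immediate retry does not,
# and a request nine minutes later succeeds again.
print(request_paper("visitor-42", now=0.0))       # 24
print(request_paper("visitor-42", now=60.0))      # 0
print(request_paper("visitor-42", now=540.0))     # 24
```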
DNA Phenotyping
In 2009 Sheriff Tony Mancuso of Louisiana’s Calcasieu Parish had only two clues to the brutal murder of nineteen-year-old Sierra Bouzigard: tissue found under her fingernails, presumably from the assailant; and the number of her last phone call.
The number led Mancuso to a band of immigrant Mexican laborers, whose DNA he collected. But none of the samples matched the DNA from under the victim’s nails. Neither did Mancuso score a bullseye with the more than sixteen million DNA profiles the FBI keeps in its Combined DNA Index System (CODIS).719
As a result, the case went cold.
Six years later the sheriff reached out to Parabon NanoLabs, a company in Reston, Virginia, which claimed it could generate photo-likenesses of suspects based solely on their DNA. Parabon’s novel technique, developed with the support of the US Department of Defense, is based on some straightforward biology.720
Our cells contain strands of DNA whose roughly 3.2 billion chemical letters spell out the basic functions of the mind and body common to all of us. But in at least ten million places along the DNA strands, there are slight variations. They are what account for our individual differences.721
With its proprietary Snapshot DNA Phenotyping System, Parabon is able to identify tens of thousands of these telltale variations in a person’s DNA—from which it deduces key identifiers.722 These include, says Parabon’s website, a person’s “ancestry and physical appearance traits, such as skin color, hair color, eye color, freckling, and even face morphology.”723
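Parabon has not published Snapshot’s inner workings, but the general idea of predicting a visible trait from DNA variants can be shown with a toy additive model: each variant contributes a small weight toward the trait, and the summed weights are turned into a probability. The marker names and weights below are invented placeholders, not real genetic data.

```python
import math

# Toy illustration of trait prediction from DNA variants. Parabon's actual
# Snapshot models are proprietary; every marker name and weight here is an
# invented placeholder, not real genetic data.

EYE_COLOR_MODEL = {
    # marker_id: (effect_allele, weight per copy) -- illustrative values only
    "marker_A": ("G", 2.1),
    "marker_B": ("T", 0.7),
    "marker_C": ("C", -0.4),
}
INTERCEPT = -1.5

def predict_blue_eyes(genotypes):
    """Rough probability of blue eyes from genotypes like {"marker_A": "GG", ...}."""
    score = INTERCEPT
    for marker, (allele, weight) in EYE_COLOR_MODEL.items():
        copies = genotypes.get(marker, "").count(allele)   # 0, 1, or 2 copies
        score += weight * copies
    return 1.0 / (1.0 + math.exp(-score))                  # logistic squash

sample = {"marker_A": "GG", "marker_B": "CT", "marker_C": "CC"}
print(f"P(blue eyes) = {predict_blue_eyes(sample):.2f}")   # about 0.93
```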
Sheriff Mancuso was shocked when he received Parabon’s analysis of the DNA collected from under Bouzigard’s fingernails. According to it, the presumed perpetrator was not from Mexico at all, but most likely from Northern Europe, and probably had pale skin, freckles, brown hair, and either green or blue eyes. “We kind of had to take a step back and say all this time, we’re not even in the right direction,” remarked the sheriff.724
In July 2017, the new information led Mancuso to Blake A. Russell, a thirty-one-year-old Caucasian man who looked remarkably like Parabon’s computer-generated snapshot. (See for yourself by checking out Parabon’s website: https://snapshot.parabon-nanolabs.com/posters.)725 Russell’s DNA matched the DNA from under Bouzigard’s nails, leading Sheriff Mancuso to charge Russell with second-degree murder.726
Virtual Wiretapping
Under normal circumstances, police authorities cannot eavesdrop on our private conversations without approval from a judge. But with the proliferation of voice-activated assistants, we are effectively wiretapping ourselves [see ROBOT: MEET YOUR NEW BFF and SPY: OMNISCIENT OBJECTS OF OUR DESIRE].
Twenty-eight-year-old Eduardo Barros found this out the hard way, in the summer of 2017. While he and his girlfriend were taking care of her parents’ home in Tijeras, New Mexico, they got into a fight. Barros allegedly accused her of cheating on him and, whipping out a gun, warned that if she called the police, he’d kill her right then and there.727
During the escalating fracas, his suspicions about her boiled over and he demanded to know, “Did you call the sheriffs?” Unbeknownst to him, a voice-activated assistant—the police report is unclear about which kind, but apparently it was somehow patched into the house’s surround-sound stereo system—mistook the words “call the sheriffs” for a command to do just that.728
SWAT and police negotiating teams quickly converged on the house and, hours later, took Barros into custody, charging him with fourteen crimes, including aggravated battery against a household member and possession of a firearm by a felon. “This amazing technology,” marvels Bernalillo County Sheriff Manuel Gonzales III, “definitely helped save a mother and her child from a very violent situation.”729
Virtual Stakeout
On December 23, 2015, firefighters responding to a 911 call rushed to a house in Ellington, Connecticut, where they found Richard Dabate lying on the kitchen floor, tied to a chair and bleeding. Minutes later they found his wife in the basement, dead from a gunshot wound.
For hours the police questioned Dabate about what happened. Here’s what he told them.730
During his drive to work that morning, his cell phone indicated the house alarm had gone off. Immediately, Dabate turned the car around and sped home, firing off an email to his boss saying he’d be late for work.
When he arrived at the house, Dabate dashed up to the master bedroom, where he was confronted by a six-foot two-inch masked intruder with a voice like Vin Diesel’s.731
Moments later Dabate heard his wife, Connie, returning from exercise class. He yelled at her to run, which she did. But the intruder chased her downstairs, with Dabate following close behind. The intruder caught up with Connie in the basement, shot her in the back of the head, then overpowered Richard, zip-tying him to a folding chair before fleeing.
Nearly unconscious, Dabate managed to drag himself and the chair upstairs to the kitchen. There, he triggered the house’s panic alarm and called 911, before passing out.
The Dabates’ neighbors were shocked to hear the news, all of them telling police Richard and Connie were a loving couple, the very picture of marital bliss. “They couldn’t keep their eyes off each other,” one neighbor reported. “It was a look that you would want.”732
In an attempt to reconstruct the events of that tragic day, police collected information from a number of unlikely sources—including Connie’s Fitbit, which they found on her body. Fitbits—which have become extremely popular in recent years—are wearable smart devices capable of monitoring people’s activity, exercise, food consumption, weight, sleep, and much more.733
According to the stored data on Connie’s Fitbit, she walked 1,217 feet after driving home from exercise class. This immediately caught the authorities’ attention, because Richard’s version of events had her fleeing only about 125 feet, the distance from garage to basement. Moreover, the Fitbit indicated Connie was still active nearly an hour after she was supposedly killed. Her Facebook page, too, showed her actively posting videos about forty-five minutes after she was purportedly lying dead in the basement.
From there, things went from bad to worse for Richard Dabate.
Data stored by the house’s security system showed the alarm that went off that morning was triggered from the basement by Richard’s key fob. And Richard’s Microsoft Outlook account revealed his email to the boss—allegedly sent from Richard’s car—was actually sent from an IP address corresponding to a computer inside the house.
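The investigators’ reasoning here boils down to a timeline cross-check: line the suspect’s story up against timestamped device records and flag anything recorded after the victim was supposedly dead. The sketch below illustrates that logic with invented times and events, not the actual case data.

```python
from datetime import datetime

# Sketch of a timeline cross-check: flag device events recorded after the
# claimed time of death. All times and events below are invented placeholders,
# not the actual case data.

claimed_time_of_death = datetime(2015, 12, 23, 9, 10)        # hypothetical

device_events = [   # (source, timestamp, description) -- hypothetical
    ("facebook", datetime(2015, 12, 23, 9, 46), "video posted to her page"),
    ("fitbit",   datetime(2015, 12, 23, 9, 55), "movement still being logged"),
    ("alarm",    datetime(2015, 12, 23, 10, 11), "basement sensor tripped by key fob"),
]

def contradictions(claimed_end, events):
    """Return the events recorded after the claimed time of death."""
    return [event for event in events if event[1] > claimed_end]

for source, timestamp, description in contradictions(claimed_time_of_death, device_events):
    minutes_later = (timestamp - claimed_time_of_death).total_seconds() / 60
    print(f"{source}: {description}, {minutes_later:.0f} minutes after the claimed death")
```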
On the strength of this incriminating evidence, police arrested Richard Dabate on charges of murder, tampering with evidence, and making false statements.734 He has pleaded not guilty and, as of this writing, is free on $1 million bail.735
“All human beings have three lives: public, private, and secret.”736 Alas, that seemingly tautological observation, made by Nobel Prize-winning Colombian writer Gabriel García Márquez, is no longer true. Today, the traditional walls between public, private, and secret are all but gone.
What’s more, there is an exact moment in history when the walls suffered a massive blow and began tumbling down. That moment was September 11, 2001.
Immediately after the 9/11 terrorist attacks, the stocks of most American companies tanked. But those of Visionics Corporation, a Minnesota-based company, skyrocketed. Why? Because its facial-recognition program, FaceIt, suddenly became just what the doctor ordered.737
“Recent events,” declared Visionics CEO Joseph J. Atick on October 3, 2001, “have demonstrated that an intelligence-based identification system is needed to identify criminals and terrorists. A key component for building this shield is a system like FaceIt.”738
“Before Sept. 11,” observes Jeffrey Rosen, CEO of the National Constitution Center, “the idea that Americans would voluntarily agree to live their lives under the gaze of a network of biometric surveillance cameras, peering at them in government buildings, shopping malls, subways and stadiums, would have seemed unthinkable, a dystopian fantasy of a society that had surrendered privacy and anonymity.”739
Yet, that’s precisely what happened.
Today, as we’ve seen in the chapters of this section, we live in a world surveilled by cameras, social media, and an army of electronic snoops, many of which comprise the so-called Internet of Things—soon to become the Internet of Everything. These web-connected smart devices quietly eavesdrop on our everyday lives and are mostly under the command of total strangers: corporations, governments, and hackers.
But also, our next-door neighbors.
These smart devices—too clever by half—watch us, listen to us, and evaluate our behavior constantly. And they do so, for the most part, with our permission. All in the name of entertainment, convenience, and, yes, security—as today’s post-9/11 American surveillance state vividly illustrates.
Still, the fight for what remains of our privacy is far from over.
Case in point: In 2016 the Fresno police department created a stir when dispatchers in its science-fiction–like Real Time Crime Center began using Beware, a new AI computer program made by West Corporation in Omaha, Nebraska. Among other things, Beware tags each of us with a color-coded threat level—green, yellow, or red—based on billions of records housed on the web, including our criminal histories, website visits, and social media postings.
During the program’s trial period in Fresno, police dispatchers fed Beware’s color-coded threat assessments and other personal intel to officers responding to 911 calls.740
“Beware is location-based and quickly provides names of individuals who are most likely residing at the addresses where officers are responding,” explains Fresno police chief Jerry Dyer. “In addition, Beware can provide contact phone numbers, relatives and their contact numbers, criminal history, and hyperlinks to any public social media posts found on the Internet connected to an individual.”741
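West Corporation has not disclosed how Beware actually computes its ratings, but the pattern Chief Dyer describes (look up the records tied to an address, total them, and map the total to a color) can be sketched in a few lines. Every record, weight, and threshold below is an invented placeholder.

```python
# Toy sketch of an address-based threat rating: total the weights of the
# records tied to an address and map the total to green, yellow, or red.
# Beware's real formula is not public; all values here are invented.

SAMPLE_RECORDS = {   # address -> list of (record_type, weight) -- hypothetical
    "123 Elm St": [("violent_felony_conviction", 60), ("threatening_social_post", 25)],
    "456 Oak Ave": [("parking_citation", 2)],
}

def threat_color(address):
    """Map the total record weight for an address to a color-coded threat level."""
    score = sum(weight for _, weight in SAMPLE_RECORDS.get(address, []))
    if score >= 70:
        return "red"
    if score >= 30:
        return "yellow"
    return "green"

for address in SAMPLE_RECORDS:
    print(address, "->", threat_color(address))   # 123 Elm St -> red, 456 Oak Ave -> green
```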
Not everyone was impressed with Beware’s prowess.
“At what point,” says the ACLU’s Jay Stanley, “does that begin to resemble China’s incipient ‘citizen scoring’ system, which threatens to draw on social media postings and include ‘political compliance’ in its credit-score-like measurements?”742
The Fresno public’s opposition to Beware was so vociferous that the city council immediately pulled the plug on it. Nonetheless, as of this writing, Beware and its manufacturer are still very much in business.743 So, beware.
As I wrap up this discussion, I see no end in sight to technology’s unprecedented assault on our privacy. Back in 1966, Supreme Court justice William O. Douglas foresaw that “the fantastic advances in the field of electronic communication constitute a great danger to the privacy of the individual.”744
His prophecy has come true.
In spades.
At this very minute, for instance, psychologist Marcel Just and his colleagues at Carnegie Mellon are marrying brain-imaging technology with AI to read our minds.745 They are doing so by having people speak a sentence and noting the brain pattern it causes.
One day we’ll be able to use the information in reverse: by looking at someone’s brain patterns, we’ll be able to know what the person is thinking.746 In other words, Just’s research is gradually producing an Orwellian dictionary, which will let us look up the meanings of human brain patterns.
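In miniature, that “dictionary” works like a reverse lookup: store the activity pattern each sentence evokes, then decode a new scan by finding the closest stored pattern. Just’s lab works with fMRI data and far more sophisticated machine-learning models; the tiny vectors below are made up purely for illustration.

```python
import math

# Minimal "brain dictionary" lookup: decode a new activity pattern by finding
# the nearest stored pattern. The sentences and vectors are invented; real
# fMRI decoding uses far higher-dimensional data and trained models.

dictionary = {   # sentence -> recorded activity pattern (hypothetical)
    "I am hungry":        [0.9, 0.1, 0.3, 0.7],
    "The window is open": [0.2, 0.8, 0.6, 0.1],
    "I want to go home":  [0.7, 0.2, 0.9, 0.4],
}

def decode(pattern):
    """Return the stored sentence whose pattern lies nearest (Euclidean) to the input."""
    return min(dictionary, key=lambda sentence: math.dist(pattern, dictionary[sentence]))

# A new, slightly noisy scan that most resembles the "hungry" pattern:
print(decode([0.85, 0.15, 0.25, 0.65]))   # -> I am hungry
```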
To be sure, the motive of the Carnegie Mellon researchers is high-minded. They wish to read the thoughts of suicidal patients, in hopes of intervening in time to save lives.747
But as we’ve seen time and again with science’s noblest intentions, there is nearly always the devil to pay on account of unintended consequences. Here, for example, it doesn’t take much imagination to foresee how corporations and governments will be able to exploit Just’s Orwellian dictionary for their less-than-noble purposes.
“I can’t imagine a system [designed] to take value readings of my mind for a remote company being used for good,” says IT specialist Chris Dancy about mind-reading technology in general. “It’s a dark path.”748
Despite its good intentions, Carnegie Mellon’s research clearly represents technology’s vanquishing blow to privacy’s last stand. For, as Pulitzer Prize-winning author Annie Dillard observes in The Maytrees: “Where is privacy, if not in the mind?”749