12 Aging Computers

I do not plan to run Bailiwick forever. At some point, I will take it offline, archive it, and move on to another software project. Like a car or a plant or a relationship, software needs care and constant attention. It also has a lifespan.

Websites and apps and programs always break because the computers they run on wear out and need to be updated. The world changes, and software has to change with it. When you host even a simple website with a company, that company will inevitably go through management changes or be sold or upgrade its servers, and something will get screwed up. Every year that you run a software project, you accumulate technical debt—the cost of maintaining the current software and adding patches and fixes. In a New York Times editorial, professors Andrew Russell and Lee Vinsel wrote that 60 percent of software development costs are spent on routine maintenance like bug fixes and upgrades.1 Contrary to popular imagination, the enormous number of engineers and software developers we’re projected to need in the future workforce is not needed for new and innovative projects; 70 percent of engineers work on maintaining existing products, not making new ones.

The problem of maintenance is a good reminder that the digital world is no longer new. Like the pioneers of the first dot-com boom, it’s middle-aged. If we consider the digital age to have begun with Minsky or Turing, the era is positively elderly. It’s time to be more honest and realistic about what goes into technology and what it takes to keep technology working. I’m optimistic that we can find a path forward that uses technology to support both democracy and human dignity.

Mistakes were made. This could be the refrain of the media industry in dealing with the digital revolution. It could also be the refrain of the tech industry in dealing with the digital revolution. The trick is to understand the past so that we don’t make the same mistakes going forward.

One thing we can do is stop referring to tech as new and shiny and innovative and instead consider it an ordinary part of life. The first computer, ENIAC, launched in 1946. We’ve had more than seventy years to figure out how to integrate technology and society. That’s plenty of time. Yet, after all this time, I regularly attend tech meetings in which the first ten minutes are spent awkwardly waiting while someone figures out how to use the projector to get a PowerPoint presentation to show up on the screen. Thus far, we’ve managed to use digital technology to increase economic inequality in the United States, facilitate illegal drug abuse, undermine the economic sustainability of the free press, cause a “fake news” crisis, roll back voting rights and fair labor protection, surveil citizens, spread junk science, harass and stalk people (primarily women and people of color) online, make flying robots that at best annoy people and at worst drop bombs, increase identity theft, enable hacks that result in millions of credit card numbers being stolen for fraudulent purposes, sell vast amounts of personal data, and elect Donald Trump to the presidency. This is not the better world that the early tech evangelists promised. It’s the same world with the same types of human problems that have always existed. The problems are hidden inside code and data, which makes them harder to see and easier to ignore.

We clearly need to change our approach. We need to stop fetishizing tech. We need to audit algorithms, watch out for inequality, and reduce bias in computational systems, as well as in the tech industry. If code is law, as Lawrence Lessig writes, then we need to make sure the people who write code are doing so in accordance with the rule of law. Thus far, their efforts in self-governance have left much to be desired. We can learn from the past, but first we need to pay attention to it.

A few projects in journalism and academia suggest that a new, more balanced view of AI is on the horizon. One such project is the AI Now Institute, a policy group at NYU founded in 2017 by Kate Crawford of Microsoft Research and Meredith Whittaker of Google. The group, which is funded in part by Silicon Valley, began as a joint project of President Obama’s White House Office of Science and Technology Policy and the National Economic Council. AI Now’s first report focused on the near-term social and economic issues that arise from artificial intelligence technology in four fundamental areas: healthcare, labor, inequality, and ethics. Their second report issued a call to arms “for all core public institutions—such as those responsible for criminal justice, healthcare, welfare, and education—to immediately cease using ‘black box’ AI and algorithmic systems and to move toward systems that deliver accountability through mechanisms such as validation, auditing, or public review.”2 At Data & Society, another think tank, danah boyd is leading a group that seeks to understand and raise awareness of the role of humans in AI systems.3 The “good selfie” experiment in chapter 9 could have benefited from a more nuanced understanding of the social context of the people making each selfie popular (and the people making the experiment). Another area worth examining is the human system for banning explicit content from social networks. Whenever violent or pornographic content is flagged online, a human must look at it and judge whether it’s a video of a beheading or a photo of an object inappropriately inserted into an orifice or some other example of the worst of humanity. The psychological effects of watching streams of filth every day can be traumatic.4 We should interrogate this practice and make a decision together, as a culture, about what it means and what should be done about it.

Inside the machine-learning community, there have been moves toward greater understanding of algorithmic inequality and accountability. The Fairness, Accountability, and Transparency in Machine Learning conference and community is a leader in this area.5 Meanwhile, Harvard professor Latanya Sweeney’s Data Privacy Lab at the Institute for Quantitative Social Science is doing groundbreaking work in understanding potential violations of privacy in large data sets, especially medical data. The lab’s goal is to create technology and policy “with guarantees of privacy protection while allowing society to collect and share private (or sensitive) information for many worthy purposes.”6 Also in Cambridge, the MIT Media Lab under director Joi Ito is doing admirable work to change the narrative about racial and ethnic diversity in computer science and to start interrogating systems. Prompted by MIT graduate student Karthik Dinakar’s work on human-in-the-loop systems, MIT Media Lab professor Iyad Rahwan has begun working on what he calls society-in-the-loop machine learning, which he hopes to use to explicitly articulate moral concerns (like the trolley problem) in AI. Another project focuses on ethics and governance of AI, spearheaded by the MIT Media Lab and the Berkman Klein Center at Harvard and funded by the Ethics and Governance of Artificial Intelligence Fund.

And of course there are data journalists, who deliver at a high level despite all of the cuts to the industry. A number of remarkable tools enable sophisticated analysis of documents and data. DocumentCloud, a secure online repository for documents, contains 3.6 million source documents as of this writing and has been used by more than 8,400 journalists in 1,619 organizations worldwide. Small and large news organizations alike rely on it, and it has hosted documents for high-impact stories like the Panama Papers and the Snowden documents.7 The number of data journalists worldwide is increasing slowly. NICAR, the annual conference for data journalists, had over a thousand attendees for the first time in 2016. Every year, awards like the Data Journalism Awards honor truly impactful investigative-data projects. There is cause for optimism about the future.

This book has covered a lot of ground in the history and the fundamentals of today’s computing technology. As I was thinking about ways people understand computers, I decided to visit the place where computing began: the Moore School of Electrical Engineering, on the University of Pennsylvania campus. There, tourists can visit the remains of the ENIAC, considered the first digital computer. In a sense, ENIAC’s home was where I began too. I was born in the hospital next door to it. My parents began dating as graduate students at Penn in the 1970s. They spent a lot of time together at the campus computer center. They would arrange their punch cards into boxes, then go down and wait their turn to feed their cards through the card reader to perform a statistical experiment on the mainframe computer. If you dropped the cards, my mom once told me, it was all over: good luck putting them back into the right order.
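The dropped-deck problem did have a period workaround worth noting: many card formats, including fixed-form FORTRAN, reserved columns 73–80 of the 80-column card for a sequence number, so a scrambled deck could be put back in order by a mechanical card sorter. A hypothetical sketch of that recovery in Python (the card contents and sequence values here are invented for illustration):

```python
# Each punched card is modeled as an 80-character string; columns 73-80
# (string indices 72-79) hold the card's sequence number.
def restore_deck(dropped_cards):
    """Re-sort a shuffled deck by the sequence field in columns 73-80."""
    return sorted(dropped_cards, key=lambda card: card[72:80])

# A three-card deck that has been "dropped" (shuffled out of order).
deck = [
    "DISPLAY 'STEP TWO'".ljust(72) + "00000020",
    "DISPLAY 'STEP ONE'".ljust(72) + "00000010",
    "DISPLAY 'STEP THREE'".ljust(72) + "00000030",
]

restored = restore_deck(deck)  # back in sequence order: ONE, TWO, THREE
```

A deck without those sequence numbers, of course, offered no such rescue, which is presumably why dropping the box meant it was all over.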

Outside the Moore building, there is a commemorative sign for the ENIAC that uses the same font as the other Philadelphia historical markers at places like Betsy Ross’s house. On this day, it was crisp and clear. Dozens of high school students in dark suits and dresses streamed past me on the sidewalk. Some clutched white three-ring binders marked Model Congress. A few of the boys wore ski jackets over their unaccustomed suits. One had a camouflage jacket with fur around the hood. I heard a girl complaining to her friend, “I don’t even know how to walk in heels,” as she shuffled down the sidewalk in black suede pumps. They were cheerful, scrubbed, having a great time feeling like grownups on a college campus, without a teacher in sight. I was reminded how events like Model UN (uncomfortable clothes and all) helped my friends and me transition to adulthood. Dressing up like grownups and participating in simulated workplace activities was one of the ways we learned how to be functional adults. I suppose it would be possible to replace these kinds of experiences with videoconferences or live chats; it would be boring, however, and the data density would be low. It’s hard to imagine that teenagers would want to do it.

I didn’t have an access card, so I hung around the building entrance until an undergraduate with an access card showed up. He glanced at me, decided I wasn’t a threat, and went back to the conversation he was muttering on his phone. I went inside and wandered, a bit lost, through the warren of hallways. I passed labs: the rapid-prototyping lab, the precision-machining lab, each filled with hulking drill presses and 3-D printers and bits of giant machinery in various states of disrepair. Engineers are hard on equipment. A mechanical-engineering student took pity on me and walked me to where ENIAC lived, in a student lounge behind wooden double doors on the first floor. Only a few panels are on display. A sign in a 1960s font proclaimed, “ENIAC, The World’s First Electronic, Large-Scale, General-Purpose Digital Computer.”

In the lounge, there were three diner-style wooden booths for student collaboration and three more bar areas with barstools for noncollaborating students. A stack of Facebook promotional postcards was strewn across a bar next to a computer and printer. “As a software engineer at Facebook, whether an intern or recent graduate, you will write code that impacts over 1.4 billion people around the world,” a card promised. Apparently, Facebook was looking to recruit people who move fast, build things, take risks, and solve problems. “Connecting the world takes every one of us,” the card assured me. I agree: connecting the world is a collaborative activity. I doubt that technology alone is the answer, however. Technology has caused the social fabric to fray in a way that suggests in-person social connections in groups and institutions are more important than ever. Understanding and group identity are best fostered through live interaction, not through screens alone.

An abandoned stack of textbooks sat next to the computer. I read their titles silently: Life: The Science of Biology; Advanced Engineering Mathematics, third edition; Student Solutions Manual to Accompany Advanced Engineering Mathematics, third edition. There was something nice about seeing biology and math next to each other, even though the books had been left by an absent-minded student. It boded well that the student was thinking about both natural evolution and technology.

There were three computer labs off the lounge, each filled with dozens upon dozens of computers. All of the lounge students ignored the ENIAC. Some were working on a Python problem set. Another few were discussing whether they’d studied for the MCAT yet. People wandered in with white plastic bags and fragrant foam containers: lunch from the food trucks parked outside. The students in the lab were being earnest and serious and adorable in the way of all college students. These students were what the Model UN kids wanted to grow up to be in a few years. Kids are so great. I love working at a university. Universities are such hopeful, helpful places.

ENIAC looked diminished behind its glass wall. The original took up an entire room in the basement; this small set of panels represented only a fragment of its original massive bulk. A row of vacuum tubes lay on the ground. They looked like miniature lightbulbs: the old-timey kind with glowing, exposed wires that hipster bars in Brooklyn use.

I faced the main part of the display. Black wires hung down from the bottom of each panel, looping from plug to plug. There were so many plugs, so many knobs. In one panel, the cycling unit, a large white eyeball seemed to stare blankly at me. Was this the readout? I realized why it looked familiar: this is the same camera eye that Clarke and Kubrick and Minsky gave to HAL 9000, the “sentient computer” in 2001: A Space Odyssey. The ENIAC eye is white; HAL’s is red. Red is much more menacing.

Black-and-white photos on the walls showed people operating the ENIAC in its original basement home. Eight men stood arranged in front of the computer, posing stiffly. In action shots, women in suits and sensible flats with immaculately coiffed hair turned knobs and plugged things in. I’ve only started to see these images in the past few years. Maybe they were always around and I never noticed them. Maybe there has been a conscious attempt to put women into the visual narrative of computer science. In any case, I like it. I also liked a casual shot of the women computers, six of them, piled on top of each other laughing. I liked that these girls were having fun together. It was a reminder that computing doesn’t have to be a male-dominated field. Many of the human computers of the 1940s and 1950s were women, but when the (mostly male) developers made the decision to push forward with digital computers, the women’s jobs disappeared. As computing became a highly paid profession, women were also edged out. It was the result of deliberate choices. People chose to obscure the role of women in early computing and chose to exclude women from the workforce. We can change that starting now.

I thought about the distance between the ENIAC and the computers in the Windows PC and Linux PC labs. These machines represent so much human effort, so much ingenuity. I have enormous respect for the history of science and technology. What computers get wrong, though, they get wrong because they’re created by humans in particular social and historical contexts. Technologists have particular disciplinary priorities that guide their decisions about developing algorithms for decision making. Often, those priorities lead them to obscure the role of human beings in creating technological systems or training data; worse, they lead them to ignore the consequences of automation for the workplace.

Looking at ENIAC, it seems absurd to imagine that this clunky chunk of metal would solve all the world’s problems. Yet as ENIAC has become smaller and more powerful and we now carry it in our pockets, it’s become easier to imagine things about it and project our fantasies onto it. That needs to stop. Turning real life into math is a marvelous magic trick, but too often the inconveniently human part of the equation gets pushed to the side. Humans are not now, nor have they ever been, inconvenient. Humans are the point. Humans are the beings all this technology is supposed to serve. And not just a small subset of humans, either—we should all be included in, and all benefit from, the development and application of technology.

Notes