20 Encryption as a Human Right

David Casacuberta and Adolfo Plasencia


David Casacuberta. Photograph by Adolfo Plasencia.

The right to privacy or intimacy is not the right to have secrets; it’s the right to decide whom you want to share information with. It's a basic thing. … It has always been part of the definition of human beings and their dignity.

The right to electronic privacy should also be part of the declaration of universal rights and democratic constitutions.

—David Casacuberta

David Casacuberta is Associate Professor of Philosophy of Science at the Universidad Autònoma de Barcelona (Spain). He earned a PhD in philosophy and a master’s degree in cognitive sciences and language. His current research focuses on the cognitive and social impact of new media, especially with respect to privacy issues, e-learning, and social inclusion.

He co-edited (with Jordi Vallverdú) the Handbook of Research on Synthetic Emotions and Sociable Robotics: New Applications in Affective Computing and Artificial Intelligence (IGI Global, 2009) and is the author of La mente humana: Cinco enigmas y cien preguntas (Océano, 2001), ¿Qué es una emoción? (Crítica, 2000), and Creación colectiva: En Internet el creador es el público (Gedisa, 2003).

Adolfo Plasencia:

David, you are a professor, a humanist, a scientist, a researcher. How do you like being introduced?

David Casacuberta:

As a professor. “Researcher and philosopher” is a bit too much. Today we all have to be researchers and publish papers in big journals as a duty, which is a bit absurd in the humanities.

A.P.:

One of the things that brought me here to talk to you is a text you wrote, “The Right to Encrypt.”1 In it you say that, according to Gibbard’s 1992 human rights model, rights must be based on basic standards for human beings. After digitization and the deployment of cyberspace, new technical and cultural contexts have emerged for those rights.

You wrote that a human right cannot stem exclusively from technical reasons. What would be the nature of the human rights that are now needed in a world ruled by digitization and coding, as Lawrence Lessig says,2 and in the global Web or cyberspace in which we also live?

Should we adapt the rights we had or reformulate them?

Should we create new human rights to protect the new areas of human relationship brought along by technology, which did not exist before?

Should we enlarge the Universal Declaration of Human Rights to include new rights?

Maybe we have to create global human rights, which are different because human rights based on local laws are not suitable for certain purposes.

What do you think?

D.C.:

All this is at the very basis of the big problems we now have. First, what we should never, ever do is create dominant rules or standards, or a dominant policy. The rights of citizens should not be reduced. And yet not everyone seems to think this way. More and more people ask me why we keep on and on about privacy. They say it is an “old” issue, that privacy no longer makes sense because you are going to be spied on anyway. You can’t help it. The NSA spies on you, they can use microphones, they read your email. People say, what’s the point of privacy today? It’s not worth discussing.

That’s why I quoted Gibbard and the idea that behind human rights there must be laws, mental states, opinions, and emotions in humans. So if we want privacy in order to stay calm, if we don’t want strangers to know things about us, that should be respected, because it is in our nature as human beings.

I do worry about something I keep hearing: young people who come and tell you, “What’s wrong with Google knowing things about you and recommending things to you via ‘your’ Gmail address? It’s quite a good thing.” Well, I don’t think so!

I think we should all have our “vital” rights preserved, rights that are vital to us for emotional reasons, because we are human, we are mammals, and we work that way, and that needs to be respected.

Something is clear: with the advent of digital technologies, these rights are threatened by new problems that require new solutions. I think the simplest solution in the long run is to create new rights; that is why I wrote “The Right to Encrypt.” It is a twenty-first-century right. I think it didn’t make much sense to a lot of ordinary people even in the last century, in the 1980s. But things have changed dramatically. Today, the right to decide whom I allow to listen to my conversations makes more and more sense, because basically it is a right we have in the physical world. If I’m at home and I do not want anybody to see me, I draw the curtains. That way I feel safe. So I decide the level of information people can have about me.

A.P.:

When we talk about human rights, what is actually enforced is a law passed by the local parliament of a given country. Regulating human rights is a competency that has been transferred to each parliament, for each democratic country to pass such regulations and include them in its legal system. But this is not the best way to approach the problem, because the logic of the digital, immaterial world of the Net is an absolutely global phenomenon.

I do not know if the scope of these new rights you’re talking about should be directly global. It would be very complicated for 190 countries to agree on new human rights to be recognized by the UN and applied by their parliaments.

Should we start thinking about global citizenship and a scope in which these rights can work and have a global nature?

D.C.:

Yes. Absolutely, the view on these things must be comprehensive. I even think this should go beyond formal laws. We should consider some ethical foundations on which global agreement could be reached, as with the Universal Declaration of Human Rights.3 I think it is essential to bring this up. Otherwise you see things like the NSA and the way it works. And things like “I can’t spy on American citizens because they are protected by the American Constitution, but I can spy on the rest of the world because they are not.”

A.P.:

Well, perhaps behind that logic, namely, “American citizens should not be spied on because they are a genuine democracy,” is an attempt to ensure good legal protection for some governmental leaders.

D.C.:

Of course. We could see it right from the beginning of the Internet, the need for other types of legislation. We now talk about privacy, but the same happens with copyright violation, or with classic offenses such as theft carried out through digital media. What happens when you have a criminal who is stealing credit cards using a computer on Bouvet Island or near Lake Baikal but controls it from another computer in Fiji, and the bank responsible for those cards is based in France, for example?4 It gives rise to such absurd problems that the whole thing needs to be reviewed. But no one seems to be willing to do so.

A.P.:

In “The Right to Encrypt” you refer to a concept by David Brin. In his 1998 book he said we were heading toward becoming a transparent society.5 The transparent society is now a fact. All our actions involving electronic media are recorded in all kinds of digital devices. Cameras record our images, Internet servers keep duplicates of our messages, email accounts are a source of information, and the same is true for everything we do with our cell phones.

Any digital instrument of this type is visible to the network system, along with its location and space-time coordinates. This information makes it possible to track users anytime, anywhere.

Do you think all this is inevitable? Some of the creators of the Internet thought that, in order to avoid the potential dangers they foresaw at the time, Internet access had to be anonymous. Tim Berners-Lee always argued that access had to be anonymous whenever possible. Self-styled security advocates seem to think this is almost a heresy.

Do you agree with this idea of the creators of the Internet? Should it be public domain, and much closer to openness and freedom than to security?

What do you think?

D.C.:

I totally agree with them. It should always be open. But I am pessimistic. I don’t think we will succeed, but it should be like that, absolutely. Cryptography can be a central element in that respect.

What finally allows you to anonymously gain access to the World Wide Web, and send messages without anybody reading them, is encryption. I am absolutely certain about that. But there are so many interests. … Lawrence Lessig has already talked about this in his book, Code and Other Laws of Cyberspace.6

But piracy, terrorism, pedophilia, and so forth are increasingly used as excuses to try and turn the Internet into a space with more control, recordings, spying, surveillance … don’t you think so?

A.P.:

Cryptography is not new. Generals in Roman legions already wrote messages on the scalp of slaves, waited for their hair to grow, and then sent them to another location, where they had their hair cut so the message could be read. So apart from Caesar’s cipher and other codes, there were many kinds of encryption.7 And two thousand years ago, the Chinese general Sun Tzu spoke in The Art of War about six different kinds of spies, so this is also part of the history of humankind.
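As an aside, the Caesar cipher mentioned here is simple enough to sketch in a few lines of Python. This is a toy illustration of the idea (shifting each letter a fixed number of positions), not a reconstruction of any historical implementation:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

secret = caesar("attack at dawn", 3)       # -> "dwwdfn dw gdzq"
assert caesar(secret, -3) == "attack at dawn"  # shifting back recovers the text
```

With only twenty-six possible shifts, the cipher falls to trial and error in seconds, which is precisely why modern cryptography rests on mathematics rather than on keeping the method secret.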

But I think Brin’s transparent society is not egalitarian. We are not all equal when it comes down to technological capabilities. We do not all have the same power. Some have an advantage; they have the capabilities and the means for very sophisticated electronic espionage. In other words, it is a “transparent” society only for those who are behind the mirror, as in the interrogation room, for those who are behind the glass. They run at an advantage. They watch without being watched. Am I right?

Besides, we take it for granted that everybody spies on everybody, but the question is, how much of this is transparent, and for whom in society? And in fact, this transparent society has multiple levels of transparency: the level of those in front or behind the mirror varies. In other words, it is transparent only for those who have the right means.

What do you think about this sort of security cynicism in governments in democratic countries? And what do you think about differences between the helpless and those who have all the advantages in the transparent society?

D.C.:

Well, this is the way things are. Technological changes have given way to enormous cultural changes, especially in recent times.

When Brin wrote about this, and when I wrote “The Right to Encrypt,” there were not yet such huge inequalities technology-wise. There were CCTV cameras in the streets, and very few people had video cameras. We seem to have forgotten all this. At the time it was very clear that technology was more or less the same for almost everybody. In that period, HTML was the same for everyone, so I was able to have a website as cool as that of the CIA, for example. The only thing it had was links, photos. …

A.P.:

But we can already see that a sentence paraphrased from Thomas Jefferson’s Declaration of Independence, “All bits are created equal. It’s not just a good idea. It Ought to be the LAW,” has been applied to the digital world and the Web, where it is used symbolically in the defense of net neutrality.8

D.C.:

At the beginning, yes, but what is happening now? Web 2.0, according to Bruce Sterling in his speech on the subject,9 has apparently facilitated publishing, but that rather hides the fact that you no longer have direct access to the code behind it. You work on programs that were made by other people, and they are in control.

Before Google, when people searched for information, we were all the same, but now Google and others have information about us that the rest don’t have. The cell phone boom means that mobile phone companies have data and metadata on people’s location and actions; we don’t have access to that. So things have changed dramatically. Brin’s transparent society is no longer possible because there is clearly a monopoly on people’s private data, which was much better distributed before. In the past, one could be as effective as or more effective than the police or the government in obtaining and distributing information, but that has changed.

A.P.:

There is another subject in your texts: digital privacy. You speak about the need to redefine privacy. You have talked about the idea of “privacy as dignity.”

But privacy is now threatened by many electronic media. It is not guaranteed even by the laws of formal democracies. According to you, the threats are also “a threat or attack against our dignity, whether we know it or not, whether it affects us psychologically or not.”10 Is this so?

Can we, in practice, do something as free citizens from democratic countries against the attacks unveiled by Snowden, for example? Or is this far beyond the actions of a citizen from a formal democracy in the Western world?

D.C.:

The essential thing is not to give up that right as citizens. It worries me to see some trends not only among ordinary people but also among politicians and journalists. “Are you a terrorist? No. Then, what do you care if the NSA is listening?” It’s like a cultural stance. And that’s a mistake because, I insist, it is a matter of respect and dignity. If the NSA has a file about me, I don’t think they will find much, that’s true. But it doesn’t matter. I don’t want them to have it because it goes against a basic idea that I have about myself, about an autonomous person who has the basic right not to generate secrets. It would be a mistake to accept “If you are not a criminal, you do not have to keep secrets.”

The right to privacy or intimacy is not the right to have secrets; it’s the right to decide whom you want to share information with. It’s a basic thing. … It has always been part of the definition of human beings and their dignity.

Then we must stand up for that right and not give it up, even if they make it easy for us. That’s where I see the next step that citizens must take. We need to think carefully about the famous sayings, “When something online is free, you’re not the customer, you’re the product,” and “Anyone who gives you something for free is trading with you.”11 And be aware of it.

A.P.:

As a matter of fact, the “free” economy entails some traps.12

D.C.:

Definitely.

A.P.:

Would you say that to Chris Anderson?

D.C.:

Yes, of course I would. I think that part is not conscious. All these companies have a series of businesses, and it is unclear whether all the implications of those businesses for us are reasonable. And so people easily post information on those free spaces, and then there is always a price to pay. I am not talking about the future. I am talking about recent political cases of people talking nonsense on Twitter. They had to resign. They did not understand the medium in this basic way.

A.P.:

In a real-time electronic reality, everything is instantaneous … we lose our medium-term bearings and don’t even analyze the consequences of what we do with digital technologies. And the consequences can be very serious. We’ve seen it in the public world, in politics, and in other fields. But when you take an action through an electronic or digital medium, even people who are educated—perhaps not digitally—do not grasp how serious the consequences of a small electronic act can be.

D.C.:

Exactly. I think that is basic, it needs to be learned. And I think that cyber-anthropologist danah boyd explains it very well with her short, powerful sentences:

We’re seeing an inversion of defaults when it comes to what’s public and what’s private. Historically, a conversation that you might have in the hallway is private by default, public through effort. It’s private because no one bothers to share what’s being said. The conversation may be made public if something worth spreading is said. Even though the conversation took place in a public setting, the conversation is private by default, public through effort. Conversely, when you engage online in equally public settings such as on someone’s Facebook Wall, the conversation is public by default, private through effort. You actually have to think about making something private because, by default, it is going to be accessible to a much broader audience.13

and we continue working as if we were in the physical world. Now, you go to a job interview and the first thing they do is Google you.

A.P.:

We were saying that we do things we think nobody will see, and it turns out that everybody can see them. You haven’t configured your privacy settings on the Internet, and then you send something to someone you know and you are actually broadcasting it in a football stadium.

D.C.:

Yes, you put it perfectly. On the Internet, “default” means that the whole world is watching you in the stadium, on a giant screen. You are telling the world quite a few things there. But we are at home with our curtains drawn, sitting in front of the computer. The door is closed. We think nobody can see us. It’s a habit we need to face. We have to change our culture about it completely. Besides, we share more and more about ourselves, and there are more and more databases, and more cybernetics handling them, and all that stuff. The more entities there are willing to pay for such data, the greater the risks.

A.P.:

You wrote that in the world of mass electronic surveillance, one is guilty until proven innocent, and innocence may be impossible to prove, whether by yourself or by others. This sentence of yours significantly predates Edward Snowden’s revelations, so you said it not in connection with WikiLeaks but before.

Were you already sure then that the situation would be as it is now?

Will the blame be put on those of us who cannot prove our innocence in the digital society?

D.C.:

Yes, because we saw it coming; we knew what things were making governments nervous at the time. In my experience as a “digital elder,” having seen how all this arose has given me more of a perspective. I don’t mean to say I’m smarter. I was there when it all started, and I saw how it developed. Obviously, governments did not understand a thing at the time. Back then, newspapers loved criminalizing the Internet. The Internet was a place full of child pornography, terrorism, information about bombs, and so forth. The headlines read: “The Web Threat.” That’s the way things were. Very few journalists were genuinely interested. The rest was witch-hunting, pure McCarthyism.

A.P.:

Let’s turn to another aspect of your article, “The Right to Encrypt.” Let me remind you of some ideas and questions in relation to that central concept, “The space of digital communications or cyberspace should be considered a public space,” and another premise, “The right to electronic privacy should also be part of the declaration of universal rights and democratic constitutions.” This is more of a question than a premise, actually. Third, you defend the need to “establish cryptography as a right, since no ordinary ICT user has the power to protect their electronic communications other than with cryptography.”14

Did what recently happened to the NSA strengthen that view?

D.C.:

Totally. There is something very interesting here, talking about history. … Originally, when governments realized something was problematic, they would always prohibit it, right? That is what Clinton did with the Internet: he created a law to protect minors and keep any kind of pornography off the Internet—basically, to eliminate it. Let me remind you that the definition of pornography in the United States is very strict. So we won that battle.

And what did governments learn from those first exercises in content control? That it was much better to pretend not to know, and to act as if those contents did not exist, than to ban them. There’s that famous phrase by John Gilmore in Time: “The Net interprets censorship as damage and routes around it.”15 In other words, the Internet treats content control as censorship, censorship as damage, and routes around it in order to keep offering content. Therefore, universities and individuals decided to give shelter to documents that were censored by governments. They didn’t care whether it was ETA before or ISIS now; they didn’t care whether they were terrorist organizations or not. They believed that freedom of expression was important, that it was above everything else. No government has tried, for a very long time now, to take down a radical video, because they know it is counterproductive.

A.P.:

And what have we learned from all this in two and a half decades with the Web?

D.C.:

That silence is the best strategy. It even happens with cryptography. When I wrote the article, “governments” wanted to ban cryptography. They preferred citizens not to have any access to it. Then they realized it was pointless because people were still using it. In fact, we used it to protest. And now what do they do? They don’t take any notice. They don’t mention it. They ignore it. And so people now do not use cryptography as they did before. But, as you rightly say, the existence of spy programs such as PRISM shows that we should all use cryptography: encrypt our mail, encrypt the contents of our hard drives, and so on, and trust these mathematical tools rather than “governments” that are not really willing to do their job.

A.P.:

We still say that cryptography prevents law enforcement authorities from spying on our electronic communications. You wrote that a few years ago and you still think so. You said, “With a PGP [Pretty Good Privacy]-encrypted message, a police officer or an intelligence agent can only shrug their shoulders because there is no human way to decipher what it says.” But some “powers” also have other methods. For example, some time ago, four companies offering fully encrypted email services, such as Silent Circle, had to close down because the method used by U.S. agencies, under U.S. law, was to have a judge from a “secret court” force these companies either to hand over the complete cryptographic keys to decrypt all communications with their customers or to close down. All four companies closed down.

Then is cryptography still impossible for these people to decipher?

D.C.:

PGP is, because it is based on this idea of autonomy. You generate the key and you are the only holder of that key. Besides, there is a whole community of hackers that makes sure that the program has no back door. But of course, very few people use PGP now. In that sense, it is secure, but only at a theoretical level; in practice we know what happens: the NSA has back doors to cryptographic systems and other computer tools, and they promote the manufacturing of computers with a back door (accessible to them, of course). So in theory yes, in practice no.
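The autonomy Casacuberta describes, where you generate the key pair yourself and only you hold the private half, rests on public-key cryptography. The following is a toy RSA sketch with deliberately tiny textbook primes, meant only to show the asymmetry: anyone with the public key can encrypt, but only the private-key holder can decrypt. Real PGP uses keys thousands of bits long plus hybrid symmetric encryption, none of which is reflected here:

```python
# Toy RSA with tiny primes -- an illustration of the asymmetry, not real PGP.
p, q = 61, 53
n = p * q                 # 3233, the public modulus (shared with everyone)
phi = (p - 1) * (q - 1)   # 3120, kept secret along with p and q
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e (Python 3.8+)

def encrypt(m: int) -> int:
    """Anyone who knows the public key (e, n) can compute this."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the private exponent d can undo it."""
    return pow(c, d, n)

c = encrypt(65)           # a message, encoded as an integer smaller than n
assert c != 65            # the ciphertext reveals nothing obvious
assert decrypt(c) == 65   # the private key recovers the plaintext
```

With primes this small, anyone could factor n and recover d in an instant; the security of real systems comes entirely from key sizes that make that factoring infeasible, which is also why a back door in the implementation, rather than in the mathematics, is the attack Casacuberta describes.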

A.P.:

Then do you think that the alibi of collective security versus the individual freedom to encrypt will finally prevail in the near future? Because the right to encrypt is an individual right and an individual act, whereas the alibi of collective protection against possible terrorist attacks is in fact used by state agencies or governments to do what they do.

D.C.:

I’m afraid so.

A.P.:

As I was saying, this “collective security” alibi alleged by U.S. law is actually based on individual freedom. …

Will this alibi defeat our individual freedom to encrypt?

D.C.:

Yes, I think so. These are some common arguments in line with the NSA’s position: “Thanks to this program, we managed to arrest many terrorists; thanks to this program, we managed to find Osama bin Laden. Thanks to this program, blah blah blah, blah blah blah.” They are very powerful arguments. Another science fiction writer, Neal Stephenson, has argued this point quite well and has reflected on it a lot. In one of his talks, he analyzed the idea of people’s changing nature.16 He said that humans are very fickle; depending on the situation, we have different models of rights. For example, if you live in a wealthy neighborhood, you don’t want security cameras and police walking around in the area; you love your privacy. But if you live in a neighborhood with frequent muggings and robberies, you want more cameras, more surveillance, and more police control. It is almost inevitable. Then it just takes a few alarms going off (even artificially) for people to accept it and say: “How many people die in a terrorist attack and how many die in a traffic accident?” We are culturally in that dilemma. Certain things generate alarm almost automatically. And if they are properly exploited, then people inevitably buy this collective security stuff. I think there is no easy solution.

A.P.:

Thank you very much, David. It has been a pleasure. We’ll keep in touch, and we will see if we can get further confused about this.

D.C.:

That’s unstoppable, I’m sure. Thanks to you; my pleasure.

Notes