In 2016, Nest, part of the Google family, was awarded a patent for a smart crib, embedded with sensors, that lets parents know what the baby is doing and what the baby needs or wants at that moment.1 It sends an alert to the parents’ phone if the room is too warm or too cold, and even responds to the baby’s mood with appropriate music or a cartoon screened on the ceiling. Now, patents frequently do not produce the goods, but this one is plausible, given popular interest in using the internet of things in pregnancy and child-rearing. In surveillance culture terms, it reflects beliefs about the efficacy of digital monitoring of children and the growth of practices involving parental reliance on embedded devices that alert parents to children’s conditions or activities.
Critics of many stripes may well worry about such devices, and serious warnings are already available. Tama Leaver, for instance, observes that such ‘intimate surveillance’ is in line with other already available products and potentials for parental monitoring and also for mediating information about their children online. He cautions that the use of such systems could normalize the idea that intimate surveillance is a necessary part of caring for children today, and that parents not taking advantage of smart cribs or platforms for sharing information about their children online could be seen as failing in their responsibilities.2 The novel opportunities offered by digital technologies may be attractive to some, but they may also come with a negatively normalizing price tag.
The smart crib exists at present only as a patent. The smartphone, however, is an everyday reality. The smartphone is an iconic symbol of the present era of digital communication. It began with the BlackBerry’s mobile email device in 1999, to which voice calling was added in 2002. The meteoric rise of the iPhone after its launch in 2007 has set the pace since. However, that ‘since’ spans only a decade at the time of writing, 2017: ten years of massive competition between Apple’s iOS devices and Android devices, among which Samsung is the biggest player. The novelty lies in cleverly combining desirable features into one pocket phone. The normalization lies in their use becoming commonplace, taken for granted, within those few short years, and in the fact that their surveillance aspects – whether for watching or being watched – are so seldom seriously considered.
I sometimes feel a bit foolish when I think of a phone as something you use to call others, to hear the voice of the other and to, well, chat. Of course, phones are still used for that purpose, the one I was raised on with landlines, but it is no longer their primary use. Texting, taking and sending pictures, finding one’s way in a strange city, even tuning your guitar: any of these and more is what phones are now used for. Old telephones could be tapped by police or intelligence agencies or, more prosaically, operators with their plug-in switches could not only connect you but listen in to the call – these were the limits of audio-surveillance by phone.
Today, smartphones are central to the emerging new attitudes and activities that I have dubbed the culture of surveillance. They have a captivating cachet with their addictive apps and at the same time are so ubiquitously familiar that not to have one is in some contexts to be marked as a curiosity. Basically, the smartphone is the embedded medium par excellence that connects users with data in everyday life. They are not just familiar, they are in many ways indispensable to contemporary life. They are used for many commercial transactions, including ticketing and online banking, as a way of being informed about breaking news, working out the ideal route for a trip, and checking what bodily symptoms might mean for personal health, among multiple other tasks. The commonest uses in the US, as in many other places, are for text, voice, internet and email.3
The explosion of awareness that smartphones are a treasure trove of data led to their uptake for predictive policing, security intelligence, consumer data analytics and the like. They are surveillant at several levels, in terms of the data they generate and disclose. Just as ‘loyalty’ programmes that earn profits for businesses are promoted as ‘rewards’ systems, so what one might call these ‘personal tracking devices’ – PTDs, for short – are marketed as ‘smartphones’. At a basic level, they are ‘logjects’4 – objects with embedded software that monitors and records their own operation. In a so-called data-driven world this is highly significant because much of the users’ data is available to both telephone and internet companies, through data brokers to other companies and, under certain conditions, to government and policing agencies as well.
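To make the ‘logject’ idea concrete, here is a minimal, hypothetical Python sketch of an object that records its own operation as a side effect of use. The class, its fields and the ‘PTD-001’ identifier are invented for illustration, not drawn from any actual device firmware:

```python
import json
import time

class Logject:
    """A minimal sketch of a 'logject': an object whose embedded
    software records its own operation as a side effect of use."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.log = []  # in a real device, synced to remote servers

    def _record(self, event, **details):
        # Every use leaves a timestamped trace, whether or not the
        # user thinks of the action as 'generating data'.
        self.log.append({"device": self.device_id,
                         "time": time.time(),
                         "event": event,
                         **details})

    def place_call(self, number):
        self._record("call", to=number)  # metadata captured first
        print(f"Calling {number}...")

    def export(self):
        # The record available, in principle, to carriers, data
        # brokers and, under certain conditions, agencies.
        return json.dumps(self.log)

phone = Logject("PTD-001")
phone.place_call("555-0199")
print(phone.export())
```

The point of the sketch is that the logging happens before, and independently of, the function the user actually wanted; exporting that log is what makes the data available downstream.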
Social media data mining is explored more fully in the following chapter, but the factors propelling its development are important here. Data on users and their online behaviour are critical whether for commerce or for policing, especially as more and more transactions, communications and social activities generally occur online. Targeted advertising is the commonplace consequence; spectacular disclosures about the NSA and other ‘Five Eyes’ members are another. Each requires analysis, including explorations of how these apparently diverse spheres of government and commerce are linked with each other. And they are sometimes linked together, as householder Pauline Cook discovered when she asked ‘Alexa’, Amazon’s Echo digital assistant, if it was connected to the CIA. Instead of answering, Alexa simply shut down – twice. It became evident that the connections are indeed strong and serious.5 ‘Alexa’ could not deny that Amazon and the CIA have connections.
To grasp what is going on, however, one has to move beyond the level of alarmist approaches and paranoid predictions to examine the ways in which data accumulate, with the rapid proliferation of sensors in smartphones and many other contexts and the ways in which human behaviour is affected, for better or worse, by deliberate attempts to manipulate outcomes. As Sara Degli Esposti cautions, the really important issues are how certain approaches to data analysis become standard, what assumptions are implicit in these data management systems about human beings as persons, and how human identities and mutual relationships are affected by these developments.6 This also stretches to the political level, discussed later in the book, of data justice. Some things may be discovered by interrogating the systems, others by examining everyday life experiences that are at least partially shaped by them.
This chapter looks at ways that people have become culturally familiar with rapidly growing surveillance under the signs of mobile communications – typified by the smartphone – ‘smart cities’, the ‘internet of things’ and ‘wearables’ such as health and exercise devices like Fitbits. I look first at the example of facial recognition technology. Then, after seeing how fascination and then familiarity with the new are a longer-term phenomenon, we examine some explanatory factors – such as invisibility, fascination and convenience – that can be understood in terms of surveillance imaginaries.
Beyond that, we examine some possible consequences: not noticing surveillance as such, becoming inured to surveillance, taking it for granted, going along with or even engaging in surveillance. These emerge in practices whose significance as surveillance starts to fade from view. This in turn has larger consequences: namely, the development of new modes of organizing everyday life where algorithms play an expanding role, and the clustering, classification and class divisions and other inequalities that result. In other words, our imaginaries and practices may contribute to the needless enlargement of large-scale surveillance as well as the emergence of user-generated surveillance.
Clearly, some surveillance technologies may be used in quite different contexts, and user experiences of them might thus be very different. A case in point is facial recognition technology (FRT), which has been popularized online through its use on platforms such as Facebook but which may also be used by an officer in a police cruiser to check on someone who may be a suspect. Put like that, it is difficult to see how the two could be considered in the same frame, that of surveillance. But what if using FRT in the one, social media, context made a difference to how it is viewed in another, policing, context? Might the playfulness of one context produce a downplaying of its seriousness in the other? Could the novelty of one lead to normalization of the other?
This is explored by Ariane Ellerbrok, who asks under what conditions previously controversial or unacceptable technologies might be made palatable, or even embraced.7 She responds by highlighting the role of play in accounting for the social life of surveillance technology. After all, play has a marketing logic as well as being an important cultural practice. Could play help to ease the way for controversial technologies that may have very serious implications? If so, this puts a twist into Johan Huizinga’s classic analysis of play, which he saw as something different from everyday life, and from profit-making or politics: a separate world.8 In the case of FRT, the play, the game, becomes a part of everyday life yet serves to support wider commercial and governmental goals.
FRT is a biometric technology that uses algorithms to match unknown faces with previously existing images of known persons in a database. It is used by police and security agencies and was widely marketed after 9/11 as a key tool in anti-terrorism initiatives.9 But as Ellerbrok observes, FRT has more recently become available in consumer contexts for photo-networking and organizing personal digital images. It is best known in Facebook tagging. In this context, pleasure, convenience and entertainment are much more in view.
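The matching step itself can be sketched simply. Real systems compute high-dimensional numerical ‘embeddings’ of faces with neural networks and compare them against a database; in this minimal Python sketch, the toy vectors, names and threshold are all invented for illustration:

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(unknown, database, threshold=0.8):
    """Compare an unknown face embedding against stored embeddings
    of known persons; return the best match above the threshold."""
    best_name, best_score = None, threshold
    for name, known in database.items():
        score = cosine_similarity(unknown, known)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy 4-dimensional 'embeddings'; real systems use 128+ dimensions.
db = {'alice': np.array([0.9, 0.1, 0.3, 0.4]),
      'bob':   np.array([0.1, 0.8, 0.5, 0.2])}
probe = np.array([0.85, 0.15, 0.35, 0.38])
print(match_face(probe, db))  # ('alice', 0.996...)
```

The same comparison logic underlies both the playful consumer applications and the policing ones; what differs is the database being matched against and who controls it.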
FRT is controversial in its original military, policing and security contexts, not least because there is evidence of its use in subordinating certain groups along racial, ethnic and class lines. It is also viewed by some as being redolent with both criminality and with state control of a ‘hard’ kind. But when Google Picasa and iPhoto Faces started offering FRT opportunities to customers, it seemed to have few if any threatening or negative features. On the contrary, it was convenient and facilitated fun – it was even ‘feminized’, suggests Ellerbrok, in contrast with its original ‘masculinized’ uses.10
Ellerbrok also makes the telling point that the photo ‘tagging’ now used extensively on Facebook and other platforms seems to have no associations among users with electronic ‘tagging’ of offenders as a means of criminal identification and tracking. Indeed, the potential for errors in photo tagging is seen as a source of humour – they are funny as well as fun. In this case, even mistakes are entertaining, quite beyond the convenience of photo-organizing programs. FRT may be associated in this case with lightheartedness, escapism and even fantasy.
The errors made in tagging may amuse, but in fact Facebook’s recognition rate is high because each time a face is viewed – from different angles, say – the machine learning program improves its recognition efficiency. An agency such as the US Federal Bureau of Investigation (FBI), on the other hand, has only a limited number of images, often taken face-on, like a driver’s licence photo. Its recognition rate is, understandably, lower than Facebook’s. Some have objected to the facial recognition dimension of certain apps, with the result that, for example, Facebook’s ‘Moments’ was launched in Canada and Europe in 2016 without FRT.11 It is far from uncontroversial.12
It may be a short step from tagging friends on Facebook or using iPhoto’s recognition system to seeing the technologies themselves, in this case FRT, as relating to play and from there to the role of play in obfuscating, normalizing or marginalizing the technologies. ‘Soft’ FRT may appear far less ominous than ‘hard’ and thus contribute to a certain dissociation of FRT from the hard contexts, says Ellerbrok. Play may also help to foster normalization by making FRT more acceptable through photo-sharing and the like. Lastly, play may also permit the marginalization of certain groups as games involving ‘spying’ offer pleasure from placing members of such groups in a subordinate position to the user.13
Of course, the serious and the playful are mixed messily in practice – games may be played in serious contexts and play may also take a more sombre tone in others. Play could well have a role in legitimating once controversial technologies, a situation that appears again in the following chapter. Playfulness again lends weight to the notion that surveillance imaginaries and practices are implicated in how certain kinds of surveillance may acquire a fascination through association and then become acceptable where once they attracted scepticism or disapproval.
More needs to be done to grasp how play relates to surveillance. This requires close, ethnographic studies of everyday social situations. But Ellerbrok’s proposal is plausible: that the use of facial recognition in the context of playful checking of photos could reduce the seriousness of issues raised by FRT both in this and in other – policing and security – contexts. In this case, the enjoyable surveillance practices associated with tagging and related apps may help to inform surveillance imaginaries in ways that yield a more sanguine attitude towards FRT in other contexts. There is, however, some evidence from large-scale studies about how the public’s attitudes to surveillance might change. This is examined later in the chapter.
Whether it is the smartphone itself or some function or app that allows for checking the identity of someone in a photo, these are historically unprecedented possibilities. The devices and what can be done with them offer new potentials. Novelty attracts attention and may even be expressed as neophilia, a term popular with some hackers for the ‘love of the new’.14 It may be seen in the way that users queue up for each new version of a smartphone. Over time, people may become more familiar with what once seemed amazing, normalized to the possibilities – including the ongoing potentials for surveillance – available through such devices.
But fascination with the new is not itself new. One wonders, for example, if today’s fascination with so-called driverless cars matches the awe that greeted horseless carriages a century ago. Even Brave New World writer Aldous Huxley seems to have been thrilled with the latter when he noted in the 1930s that the ‘drug of speed provides the one genuinely modern pleasure’.15 Are there digital equivalents to Huxley’s apparently avid attraction to the ‘drug of speed’?
Just as the horseless carriage took time to develop, so too the shift towards ‘assisted’ vehicles has been in process for a long time – at least since power-assisted brakes or steering. The actual slow progress does not stop enthusiasts such as Tesla chief executive Elon Musk from claiming that the ‘Autopilot’ function is ‘almost twice as good as a person’.16 Many seem to be intrigued about these possibilities, especially if they already have features in their car that hint at further automation to come. Such openness to the new plays a role in informing imaginaries and, because of that, practices – which, as we have argued, then become carriers of the imaginaries, sometimes in a widening spiral.
What is not so often considered is the way that the assistance, in vehicles as elsewhere, comes primarily from computer systems that have steadily been colonizing cars and trucks for many years, and what this might mean for surveillance. Because such systems are constantly recording and reporting where the machine is, how it is operating, as well as what the user might be doing, they are highly surveillant in practice. But they are surveillant in ways that differ from earlier forms of surveillance – they depend on the continuous collection of data through sensors.17
Autonomous vehicles are in the news, and the safety of passengers has been a recurring theme in discussions of their likely impacts. This is appropriate, of course, and must remain a priority for makers of self-driving cars. However, the computer systems that assist such vehicles, rendering them increasingly autonomous, have many dimensions. A key aspect of this is how instructions are given to the car, not to mention how the car communicates with the user. In order to work well, much data have to be entered, in an ongoing way, about potential routes, stopping places – for coffee or other errands – time, pick-ups and the like.
Those data necessarily reveal much about so-called self-driving cars and their users, which it may be possible to keep ‘self-contained’ inside the vehicle, as with the experimental Google car. Such vehicles contain a mother lode of personal data on their users, such as where they travelled, for how long, at what speeds, where they stopped. However, one suspects that a more likely scenario is one involving interconnected vehicles in a road-user network, where information is shared in order to keep the whole system running smoothly.18 Information may circulate about other users, road conditions, weather, along with GPS location data. Indeed, these data may be advertised as contributing to the enhanced safety of the vehicle. In the US, the Department of Transportation already has a Connected Vehicle Program to coordinate existing forms of assisted driving in preparation for their further development.
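What such sharing might look like at the level of data can be suggested with a small, hypothetical sketch; the field names below are illustrative, not any manufacturer’s or regulator’s actual schema:

```python
import json
from datetime import datetime, timezone

def telemetry_frame(vehicle_id, lat, lon, speed_kmh, odometer_km):
    """One snapshot of the kind of record a connected vehicle might
    broadcast to a road-user network (fields invented for
    illustration)."""
    return {
        "vehicle": vehicle_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "position": {"lat": lat, "lon": lon},  # GPS location
        "speed_kmh": speed_kmh,
        "odometer_km": odometer_km,
    }

# A day's worth of such frames reconstructs routes, stops and speeds.
frame = telemetry_frame("CAR-42", 44.2312, -76.4860, 63.0, 18204.7)
print(json.dumps(frame, indent=2))
```

Each frame looks innocuous on its own; it is the accumulation and circulation of frames across a network that makes the vehicle a surveillance device.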
The autonomous car may seem like a dream for harried drivers, but the draw, convenience, also comes with possible drawbacks, including unwanted surveillance. This is nothing new, of course. Precedents were set with the Uber car-hailing service, an app that allows users to hail a ride remotely, but also collects tremendous amounts of data with the familiar rationale – to improve the service for users. For instance, in 2015 Uber initiated a new data-gathering technique that involved continuing to collect information on passengers’ whereabouts even after their ride was over. This prompted a complaint in the US from the Electronic Privacy Information Center (EPIC) to the Federal Trade Commission, on the grounds that it ‘far exceeds what customers expect from a transportation service’.19
This raises questions about how some are drawn to new technologies, from the digital computer to the internet of things, with a sense of wonder and the thrill of the new. And how these things soon become familiar, taken for granted, especially when they are invisible and do not fit traditional definitions of surveillance and privacy. In Orwell’s twentieth century, a vehicle deemed to be surveillant would have been deliberately bugged. In the twenty-first century, it is commonplace for vehicles to be surveillant in the sense that they record their own use, and those data are available for scrutiny, even before autonomous vehicles are common. It would be easy to become blasé, assuming that new, self-driving cars are safer, part of today’s world and therefore basically beneficial or at least neutral. In the case of smartphones, clearly, many enjoy the convenience, the contact with others, even the aesthetics and feel of the devices. Desire for devices is significant. They offer status, approval, enjoyment.
Such sentiments are also true of the main player in this drama, the smartphone. The ways in which users are fascinated with their digital tools vary considerably, but Apple aesthetics give some interesting clues. No Silicon Valley company has managed to achieve quite the status of Apple when it comes to design. It appeals to what began as 1960s countercultural values that are both stylish and empowering. Apple was non-conforming and clearly aspired to the imaginative and even to liberation. What the iPad or iPhone can do is important, but the medium, the device, is itself designed to delight.
The lure of the seductive surfaces is acknowledged, for example, by Guardian writer Jonathan Jones, who admits that the iPad is not the easiest machine for writing – yet that is how he uses it. He admits that ‘I am captivated by the beauty of this piece of technology’.20 As he says, it was the ‘soft machine’ aesthetic that succeeded from the start. At a time when science fiction was foreboding, worried about a dark, inhuman future of disembodied users staring into screens, Apple offered what seemed to be a simple, human alternative, machines that accompany us in daily life, to whatever location we wish – on public transport, in the café, in the street.
While computing machinery has been seen as sexy and attractive and has been given personal names for a long time, this has been particularly true of the iPhone era. They are seen as objects of desire, even, says Deborah Lupton, as ‘edible’ or ‘delicious’.21 They may also be thought of as being intimately attached to their owners, as extensions or prosthetics for the body, associated with personhood. Users may even think of themselves in cyborg-like terms, particularly when the iPhone is unlocked with a fingerprint or with facial recognition rather than a password.
Interestingly, British psychological researchers found that the very choice of phone may be a predictor of some traits.22 While iPhone and Android devices each have roughly half the market, it is younger people, more frequently female, who own the former, seeing it as a status object and not caring that so many others use the same device. More men than women use Android devices, however, and they appear to be less extroverted and less likely to seek personal gain. The researchers point out – of course – that from their findings one could predict who might buy an iPhone or an Android device, but also that our devices may become more like us, leading to unease when someone else uses our phone. How far your image of yourself matches the profile created by those with access to your data is a question of imaginaries meeting analytics and what happens to each when they do.
At the same time, the sheer convenience of computer machines is also part of their fascination, and how they become so familiar. As a Pew study of so-called millennials found, this first ‘always connected’ generation depends on convenience: ‘Steeped in digital technology and social media, they treat their multi-tasking hand-held gadgets almost like a body part – for better and worse. More than eight-in-ten say they sleep with a cellphone glowing by the bed, poised to disgorge texts, phone calls, emails, songs, news, videos, games and wake-up jingles.’23
But this is about more than being excited about new technologies. It is also about how so many new technologies are embedded as sensors in the routines of everyday life and how inconspicuous they are. Once, sociologists spoke of ‘cultural lag’,24 meaning that cultures take time to catch up with technological change and that some dislocation or distress may occur in the process. The notion does still have some resonance with what is happening today. Except that with the consumerist pressures to embrace the new, the idea of cultural lag now seems more complicated. And, of course, the responses to the new technologies vary depending on factors such as age, gender, race and class. Not everyone lines up for the newest phone; many users are actually quite ambivalent about them, even before questions of surveillance are raised. In the Pew survey of smartphone use, while 46 per cent of American users polled said they ‘couldn’t live without’ their smartphones, 54 per cent agreed that they were ‘not always needed’.25
But more is going on than just some acclimatizing to the new. The context itself is shifting. While the familiar may become ‘normal’, it also happens in this case because surveillance institutions are changing – notably with consumer monitoring and tracking and the growth of the data industry – and this in turn may portend a larger-scale shift. Building on past developments, dominant cultural expectations now incline many populations to individualism, and towards self-exposure and self-promotion, topics considered in the next chapter. Here we are concerned with issues of how new technologies are often seen as boons to convenience or security, but how they are also imperceptibly surveillant. It may turn out that fears of potentially damaging surveillance in vehicle computer systems, the subtly normalizing effects of features such as facial recognition on Apple or Google devices, or worries about how users’ own surveillance may negatively affect others play little role in surveillance imaginaries and practices. Ignorance or unconcern may be just as important.
But these features of digital modernity are still worth considering and studying carefully. Given the substantial political-economic role of these new devices and systems, ignorance and unconcern may well have negative consequences, not only for individual users but for wider issues such as democratic participation and human flourishing itself. So the focus requires sharpening.
We turn next to reflect on how far this may portend the rise, not only of those much-hyped smart cities, the internet of things, and ‘wearables’ carried on your own body, but of a new way of organizing consumption and of modelling business. It may one day be widely seen as knowing capitalism,26 or perhaps better for our purposes, surveillance capitalism.27 This view, too, deserves more than a passing thought.
A defining feature of surveillance culture is the state of technology. The use of interactive and smart technologies shifts the focus from fixed to fluid surveillance, from hardware to software. Data from the smart electricity meter show whether or not you are at home. Your smartphone logs your location and your ‘likes’ as well as whom you contact. But this occurs within a wider cultural context in which gauging risk and opportunity are central, anticipating the future is a key goal, and of course where economic prosperity and state security are locked in a mutual embrace. The result? Smart surveillance and social sorting go hand in glove. And together, they inform and inspire surveillance imaginaries and practices, which in turn help to enable or constrain the further development of smart surveillance.
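Even the smart-meter example involves a simple inference. A naive sketch, with an invented ‘everything off’ baseline and invented readings, shows the principle; real load-disaggregation techniques are far more sophisticated, but the inference is the same in kind:

```python
def likely_home(hourly_watts, baseline_watts=150):
    """A naive occupancy inference: sustained consumption above an
    'everything off' baseline suggests someone is home. Threshold
    and readings are invented for illustration."""
    return [w > baseline_watts for w in hourly_watts]

# Fridge-only draw overnight, then kettle, lights and television.
readings = [120, 115, 130, 900, 450, 600, 140, 1100]
print(likely_home(readings))
# [False, False, False, True, True, True, False, True]
```

A few lines of arithmetic turn a utility reading into a record of presence and absence; this is the sense in which smart surveillance is woven into infrastructure rather than added to it.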
But what sorts of imaginaries and practices emerge from the increasing ubiquity of computer-based things? Having a sense of the cultural meanings of things like the smartphone and anticipating futures are contributors to surveillance culture. They are undoubtedly modern, ‘rational’. But what of the ways in which such things become part of the Lebenswelt or lifeworld? How do they affect surveillance imaginaries and our practices? This is a much less modern-rationalist kind of question. It relates to a shift that occurs when we think less of these computer-things as ‘tools’ – where function and efficiency to ‘users’ are paramount – and more as ‘presence’, where the specific user and context are crucial.
The smartphone is an ideal example of a technology as ‘presence’. Things like this enter our lives; some people sleep with their devices by their beds, showing that they have fully accepted, welcomed and domesticated them. They do not simply exist, out there. For some, they are like prostheses, artificial body parts, on which the user depends and which the user treats seamlessly as a taken-for-granted and necessary part of the body.28 Philosophically, we are now thinking in a more phenomenological fashion about how people ‘dwell’ with computing machines, rather than merely ‘interact’ with them.29 We are asking what they mean to us, not just what they ‘do’.
Embedded, wearable and mobile technologies slip easily into the routines and regimes of everyday life. They are purchased by people for whom they often offer seductive, convenient benefits, including personal enhancements. Most obviously, with mobile and then smartphones, the device becomes a part of life, a personal object not just a communication tool. But more generally, as ubiquitous computing and the internet of things develop, both designers and ‘users’ become more aware of the need for appropriate ‘interfaces’ that diminish the ‘distance’ between users and their machines. Hence, for instance, items of clothing that contain the kinds of sensors otherwise found in personal tracking devices like Fitbits.
One good question is, how do users start to use the device or system? How is it ‘invited’ into daily lives, consciously or not? It depends in part on how it is presented, or presents itself, to potential users. Is it an appliance with functions, or does it represent a particular expression? When people see others’ things, especially their devices, and what they do with them or to them, they get a sense of what they mean to them and how they are part of their lives. Albert Borgmann contrasts the ‘device paradigm of modernity’ that works by commodifying wants and desires with what he calls ‘focal things’ that ask for attention or involvement; they are engaging centres in human practices.30
In these ways, aesthetics become important in considering smart technologies of surveillance. What the machine expresses is significant. This depends on its location in time and space much more than on the mere technical specifications that it possesses and which the retailer will play up. The machine may be built to be fast but in the world of ‘dwelling’ it is not necessarily so. Its ‘presence’ is likely to be long term. Users may even develop ways of using phones that are distinctively theirs; the devices have been invited and welcomed into their lives in particular ways. Of course, the machine may then become, paradoxically, no longer unique, just another building block of users’ lives, like things made of wood or metal. In this sense, it is not only embedded sensors that ‘disappear’ but also the thing itself, the computer.
Embedded sensors and the computer-communication systems that support them are also (dis)appearing in urban environments. They often do so under the banner of ‘smart cities’, which are described as built environments – both new and retrofitted – integrated by such computer-communications systems along with the internet of things. Enthusiasts think of them as intelligent platforms for the life of the entire city. And there are, doubtless, benefits for citizens – note the connection between this word and city life – to be gleaned from more creative uses of available data. However, critics worry that the ancient and complex task of city building cannot be reduced to computation.31 Data are always collected in particular contexts and all such cities are bound to be surveillant, whether in obvious ways using street cameras or drones, or through less than obvious data centres and smart meters.
In such contexts, it is easy to see how smart cities could become incubators for surveillance culture. If you live in an urban area described as smart, then the informational infrastructure will be taken for granted and its novel features will be normalized. The disappearance of more familiar markers of surveillance activity will be matched by the seeming ordinariness of monitoring and calculating the ‘optimized’ city. Watching and being watched are hard-wired into the smart city. To explain what I mean, we may travel to the long-promised green-field smart city of Songdo.
This new, built-from-scratch city nearing completion near Seoul, South Korea, is intended to be a high-tech and international business hub for North East Asia.32 Some proposals that have been made for Songdo are to fit public recycling bins with RFID sensors so that people are credited when they throw in their bottles, and to provide pressure-sensitive floors that alert emergency services when older people experience falls.33 Smartphones, which gained popularity early in South Korea, store health records and may be used to pay for prescriptions.34 Children may be tracked – for their safety – by microchips embedded in bracelets35 and no doubt, when they arrive, smart cribs will be available for younger siblings.
This is the world of ubiquitous computing – indeed, Songdo is described by promoters as a ubiquitous city. However, bottles, floors, cellphones and even children’s safety bracelets are all very tangible, concrete phenomena, and it is easy to be distracted into thinking only about them. Ubiquitous computing environments also depend on embedded sensors, mobile technologies and, crucially, relational databases. These may be less visible but they are part of what makes ‘ubicomp’ work.
Take relational databases. These connect the bottle with the consumer, the elderly person with emergency services and, through the cellphone, the patient with the prescription. The database can very quickly sort through a range of digital tables to make the appropriate links. Indeed, such systems sense the environment, create a context for the information, communicate internally among the relevant components, draw inferences from the available data and draw some conclusions.36
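A minimal, hypothetical sketch using Python’s built-in SQLite module shows how trivially such links are made; the tables and records are invented, in the spirit of the Songdo examples:

```python
import sqlite3

# Invented schema: the point is not these particular tables but how
# a relational join connects separately collected records about
# one and the same person.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE residents (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE bin_events (resident_id INTEGER, item TEXT, ts TEXT);
    CREATE TABLE prescriptions (resident_id INTEGER, drug TEXT);
    INSERT INTO residents VALUES (1, 'Resident A');
    INSERT INTO bin_events VALUES (1, 'wine bottle', '2017-05-01');
    INSERT INTO prescriptions VALUES (1, 'statin');
""")

# One query links recycling behaviour to health records.
rows = con.execute("""
    SELECT r.name, b.item, p.drug
    FROM residents r
    JOIN bin_events b ON b.resident_id = r.id
    JOIN prescriptions p ON p.resident_id = r.id
""").fetchall()
print(rows)  # [('Resident A', 'wine bottle', 'statin')]
```

The data were gathered for entirely separate purposes; the join is where the surveillance happens.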
Behind the relational databases, of course, are management decisions – of business, policing, security, healthcare or whatever – that guide the ‘conclusions’ arrived at by ‘intelligent systems’ in smart cities. As Shannon Mattern notes, smart city pundits depend on calculating the ‘effectiveness’ of cities using key performance indicators – clearly a very business-driven approach.37 But what, for instance, if the consumer, elderly person and patient were one and the same, and access to healthcare was dependent on evidence of moderate alcohol intake, as judged by which bottles were tossed in the bin? How simple and subtle surveillance could be, and how innocently associated with smart business acumen.
Without the hype of smart city utopianism, however, similar changes are occurring elsewhere. There is a push to ‘informate’ cities and to offer new services and benefits from electronic networks connecting previously separate functions into large-scale systems. The shift to online communications and to dependence on tracking and sensors has been a steady one and is felt every time one obtains a medical prescription, tries to contact city hall, or even attempts to park a car.
In the last case, the urban area of Santander, Spain, boasts wireless in-ground sensors that are supposed to reduce the stressful search for a parking spot and optimize the usable spots, thus also helping to reduce pollution.38 Such systems are always likely to be ambivalent. Living there, one may well be grateful for such a system while at the same time being made aware that the system, and perhaps even someone running it, must know where you are, and when. Another component is added to the surveillance imaginary.
In many minds, such awareness immediately connects ubicomp and sensors with surveillance. Rob Kitchin, for instance, notes that would-be smart cities have surveillance systems from smart cards to automated licence plate recognition cameras, ‘intelligent transportation systems’ to smart meters for electricity consumption. In addition, he says, ‘Urban places are also now full of objects and machines that are uniquely indexical that conduct automatic work and are part of the internet of things, communicating about their use and traceable if they are mobile.’39 They transfer data between themselves, producing more derived data.
Because of this, some believe that, far from being utopian – as in hyped depictions of Songdo or Silicon Valley-type dreams for American cities – RFID and other wireless technologies may well have dystopian aspects. Or, at least, smart cities may end up reflecting state and corporate priorities – notwithstanding Songdo’s bicycle-friendly and urban-farming aspirations. If everyone is observed, automatically and constantly, questions about surveillance deserve to be raised. These novel cities may foster the normalization of surveillance, to which ordinary urban dwellers contribute simply by communicating, commuting or caring for their elders.
The questions with which one is left have to do with the intensified scrutiny under which smart city dwellers live, and how this is seen and experienced by them. Will they accept the reassurances about safety and security, convenience and comfort? Or will they be brought into public debate as matters for discussion, contestation and community-based solutions that might be relevant and acceptable to all? As we build our case, we turn our attention from smart cities to another area where there is much promotion and few clear markers – wearable technologies for self-tracking. Here again, surveillance novelties may be subtly normalized.
Imagine. You show up for your first day at your new job, only to be informed that you will be wearing a device that will track your movements, your fitness and your health, perhaps even a body-worn camera or a biometric authenticator to check that it really is you who wishes to enter a secure area. The company will have access to the data you generate but you are not told much about how they will be analysed and interpreted. Yet it is you who will be doing the surveillance. The employee is involved in collecting, analysing and sharing information on herself. Researchers at the Surveillance Studies Centre produced a 2017 report showing that workplaces widely promote the use of wearables.40
Wearable technologies such as Fitbits are rapidly becoming commonplace, at least in some countries of the global north. The ubiquitous computing and internet of things mentioned earlier touch not only vehicles and buildings but, with ‘wearables’, human bodies as well. They enable the body to carry electronics, software and sensors. Fitness devices or activity trackers are probably the most common, along with smart watches and smart clothing. They may be straightforward consumer goods or, increasingly, gadgets that are supported or even required for certain jobs or activities.
Wearables measure aspects of our behaviour, data that they also record, track and transmit. Fitness and wellness data are checked through the use of devices that are worn as bracelets and armlets or in clothing. However the actual electronic thing is configured, the point is that it depends on complex computer and communication technologies as well as on the user – who often imagines herself to be the chief beneficiary. In the case of wearables, it is clearly the user who is in view, regardless of who has access to the wellness, fitness, movement and other data.
However, another potential beneficiary – apart from the health and fitness corporations – is the employer. There is a market for wearables to enhance the productivity and safety of workers, both in more traditional heavy industries and in desk-based occupations. They may also assist people with disabilities or heighten the sensitivity of workers to their tasks. Of course, in workplace settings specific concerns may be expressed, such as that the employer may use the device for ‘spying’ on employees’ whereabouts or checking that breaks are limited to the agreed times.41
Those who use wearables for their own purposes may do so with a degree of seriousness that goes beyond earlier forms of keeping a journal or diary. Indeed, the Quantified Self movement began a decade ago with meetings and conferences, and the term is now in more general cultural use. Although more will be said about self-surveillance in the next chapter, it is worth noting here that wearables take this to a high level. They include practices such as life-logging and self-tracking in a quest for self-improvement. By knowing more about oneself, one can monitor one’s health or one’s personal performance in many contexts.
Users of wearables display surveillance imaginaries that raise questions about the normal – ‘Am I doing this correctly?’ ‘Is my diet regime paying off as expected?’ ‘How am I performing in relation to my peers and my past?’ Their actual practices, then, follow from this: an assiduous wearing and checking of the devices to be sure that one’s performance is indeed up to or beyond par. It takes little imagination to see how this, too, may well have normalizing effects as the role of data is raised to a high level, perhaps eclipsing other ways of being aware of how well – or not – a user is doing.
Deborah Lupton describes these entanglements of humans and digital data as ‘lively data’.42 She does so for several reasons, among them that the data are always on the move, shifting, changing and subject to modification based on the goals of the end users and of the parent corporations. She has in mind the whole range of smart devices and contexts and describes the emerging data practices of such users. Again, because we are considering the digital, all these practices have surveillance implications, from the interpersonal through to the monitoring activities of corporate and government organizations.
One finding common to a number of research programmes is that while people may be aware of the apparent benefits of the circulation and analysis of data for national security or healthcare, they do not know what happens to data collected from them. They may even personally value the same kinds of data when self-monitoring or fitness tracking is involved but be less than sure about why it might circulate and with what effects, on them or on others.43
In this emerging world of ubiquitous computing, smart environments and always-on devices it is hardly surprising that few know what happens to personal data – if they can still be described thus – or what effects others’ use of it has on us or on others. The use of sensors, which is a large part of what makes cars, buildings or clothing smart, has to be considered in relation not only to how they respond to human presence or activity, but also to how they are programmed to function in the ways that they do. The hidden element in all this is the algorithms, the codes that guide the workings of the system.
If, as I am proposing, surveillance imaginaries and practices attend many commonplace devices, systems and situations today, it is important to find out what is being ‘normalized’ and how it becomes so. Digital lives are inevitably under surveillance and surveillant in some respects, but is it possible to know exactly how this works? Where does the definition of ‘normal’ come from, that becomes the rule by which users measure their adherence, their degree of congruence? In the digital world, it is often quipped that ‘there’s an app for that’. Behind the app, the wearable, the smart crib, city or phone are algorithms.
In his book The Black Box Society, Frank Pasquale homes in on reputation as a key area for considering what algorithms do. This refers to the reputations of businesses but also especially to the personal reputations of ordinary users of the internet. As he says, ‘In ever more settings, reputation is determined by secret algorithms processing inaccessible data.’44 While the internet companies and others who use algorithms claim that they are scientific and neutral, this is hard to substantiate. And when matters such as credit scoring and social ranking, not to mention decisions as to who gets onto no-fly lists, are achieved using those algorithms, it is at least clear that much hangs on how they are constructed.
There is little doubt that algorithms, and indeed the Big Data practices that require them, involve power relations. After all, they are used to identify potential terrorists as well as to generate credit ratings or parole recommendations. In 2016, for instance, ProPublica, a non-profit, public interest newsroom, exposed how a correctional tool used to inform sentencing decisions overestimated the recidivism rate of black defendants.45 Thus, who has access to datasets, what Big Data correlations really mean, how data are obtained and what inequalities and injustices might be caused by the use of certain algorithms are all vital questions. These form the larger context in which lives are organized today. While ordinary users may rely on various devices to keep track of personal progress, at home, at work, at play, such practices are also subject to wider scrutiny.
In the end, as John Cheney-Lippold avers, ‘we are data’.46 In other words, the systems and platforms described here view their users only in data terms. From the elderly person who might fall in their smart city building to the worker who is required to wear a tracker in the office or manufacturing plant, each is known to the system as data, organized and analysed by algorithms. The same is true, of course, of the more intimate-sounding dating sites or some ‘caring’ infant-monitoring systems. The potential lover or crying child is understood only as data.
That power relations may be discerned in those algorithms is, in this sense, beside the point. Their ability to control, to govern, is not circumscribed by rights or legal requirements. Such ‘algorithmic governmentality’, as Antoinette Rouvroy explains it, ignores the embodied individuals it affects because its supposed subject is actually a ‘statistical body’. As she says, individuals ‘are considered as temporary aggregates of exploitable data at an industrial scale’.47 How does this work out in practice?
In the effort to understand contemporary forms of surveillance, particularly from the perspective of those who experience it, it is difficult not to conclude that some of what happens today is quite different from the kind of surveillance normally associated with the word. The fascination and familiarity with embedded sensors or body-worn trackers takes us to a world of rather voluntary, open and even participatory surveillance that goes beyond that of national security or policing surveillance, which is often secretive, covert and coercive.
All the same, in a time of liquid surveillance, digital data flows allow the former to mix and mingle with the latter. Predictive policing, for instance, draws upon just those ‘voluntary and open’ sources for data that will then be repurposed for surveillance of the more conventional kind. Such policing practices are becoming popular in Europe, North America and elsewhere. They depend on using algorithms for Big Data analysis to indicate geographic shifts of crime in a city, or to second-guess who is likely to reoffend.
A perennial preoccupation of sociologists has been to discover how societies are divided and why. In modern times the idea of social classes has been prominent. If Karl Marx saw social classes forming in relation to production – those who own the means of making money over against those who have nothing but their labour to sell and nothing but their chains to lose – then Max Weber, more subtly, conceived of class in relation not only to production but also to the marketplace, to purchasing power, status and the like. But by the later twentieth century, none of these categories seemed satisfactory. Certainly, the rich were getting richer and the poor, poorer, and both situations were increasingly globalized.
While some sociologists seemed to turn their backs on class analysis and to focus on other divisions, another process was quietly under way that had everything to do with classifying populations. Its outcomes were not a bit less portentous for people’s life-chances and opportunities than those explored by Marx and Weber. That process began to flourish in the 1990s with geodemographic marketing, which divided populations into clusters according to their purchases and preferences, before morphing into an information-intensive system generally called ‘relationship marketing’. The earlier phase made much of postal and ZIP codes as a means of categorizing consumers – ‘birds of a feather flock together’, the marketers observed – and this process persists to this day.
The Claritas Corporation, for example, divides cities into ‘urban uptown’, where you might find the ‘young digerati’ but also a ‘bohemian mix’, and the ‘urban core’, home to ‘low-rise living’ – ‘a transient world for young, ethnically diverse singles and single parents’ – and ‘city roots’ – ‘lower income retirees’. This means, significantly, that such classifications are connected with place. As the Claritas Corporation put it, back in the 1990s, ‘you are where you live’. By which was meant: data about you correspond with similar data about others living in the same neighbourhood.
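The underlying mechanics are not mysterious. A hypothetical sketch – with invented postal prefixes, incomes, ages and cluster thresholds standing in for proprietary definitions – shows how ‘you are where you live’ reduces to aggregation by postcode followed by labelling:

```python
from collections import defaultdict
from statistics import mean

# Invented consumer records: incomes in thousands, toy postal prefixes.
records = [
    {"postcode": "K7L", "income": 95, "age": 31},
    {"postcode": "K7L", "income": 88, "age": 28},
    {"postcode": "K7M", "income": 34, "age": 67},
    {"postcode": "K7M", "income": 29, "age": 71},
]

def label(avg_income, avg_age):
    # Stand-ins for proprietary cluster definitions.
    if avg_income > 70 and avg_age < 40:
        return "young digerati"
    if avg_income < 40 and avg_age > 60:
        return "city roots"
    return "unclassified"

clusters = defaultdict(list)
for r in records:
    clusters[r["postcode"]].append(r)   # birds of a feather...

for pc, group in clusters.items():
    inc = mean(r["income"] for r in group)
    age = mean(r["age"] for r in group)
    print(pc, label(inc, age))
# K7L young digerati
# K7M city roots
```

The label attaches to the postcode, not the person; every resident inherits it, which is exactly why classification by place carries such power.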
Territory is emphasized as a means of differentiating between social groups. It helps to dissolve a common distinction, seen especially in Weber’s work, between class as economic situation or as material property, on the one hand, and status, as lifestyle choices and consumption, on the other. Although the hierarchies of classification, say, from ‘urban uptown’ to ‘city roots’, are clearly economic and relate to wealth, they are based on consumption patterns as the means of distinguishing between different social groups. The geodemographic analysts build on popular phrases that describe city zones, from ‘SUV-friendly’ to ‘granola belt’ to ‘gritty neighbourhood’. But they turn them into classifications that have very real effects on people’s opportunities.
Pierre Bourdieu wrote in the 1980s about the importance of such ‘distinctions’,48 which include an aesthetic dimension which subtly sorts differences between social groups, keeping the ‘lower orders’ in their place. The right to classify others bespeaks considerable power, according to Bourdieu. As Roger Burrows and Nick Gane observe, while Bourdieu thought of classification mostly as a decidedly governmental activity, such classification is now, at least initially, very often in the hands of major corporations such as ChoicePoint or Experian.49
However, it is worth noting how, with the spread of surveillance across different domains, but often using similar codes, both state and commercial entities are involved in the process.50 Either way, the geodemographic dimension connects such classifications with locations, places where people may feel they ‘belong’. And in an increasingly individualized world,51 such a sense of belonging is also experienced as a positive benefit. As Burrows and Gane say, where people (desire to) live, and especially where they consume, means a lot for who they are. People choose to group themselves with others with whom they believe they have features in common and this is reinforced by geodemographic and, today, social media marketing methods. Embodied individuals, in other words, began to interact with their data doubles, as an evolving aspect of surveillance practice.
Processes of social sorting are constantly in play. Some people are excluded altogether; some are classified in negative ways. Such processes, in which ‘personal and group data are used to classify people and populations according to varying criteria, to determine who should be targeted for special treatment, suspicion, eligibility, inclusion, access…’,52 are deeply involved in the somewhat ‘liquid’ social classes of the twenty-first century. As Joseph Turow observes, we have to move beyond thinking only that individuals can be identified and harmed using data. Corporations construct our reputations, which in turn determine what information, consumer deals and other surveillance attention we receive. These affect our opportunities, our view of ourselves and the world we are presented with.53 Our surveillance imaginaries and practices, in other words.
More than a decade ago, a Report on the Surveillance Society in the UK concluded that ‘surveillance varies in intensity both geographically and in relation to social class, ethnicity and gender’.54 ‘Individuals are seriously at a disadvantage in controlling the effects of surveillance’ because of wide differentials in access and influence. And, at the same time, ‘Individuals and groups find it difficult to discover what happens to their personal information, who handles it, when and for what purpose.’ But it is equally true, as argued here, that surveillance itself helps to create those classes in the first place.
That report noted that social sorting ‘affords different opportunities to different groups and often amounts to subtle and sometimes unintended ways of ordering societies, making policy without democratic debate’. Of course, every administrative system aims to sort through differences between different groups in the population in order to ensure, for example, that taxes are paid appropriately or that benefits are distributed according to agreed criteria. And in so doing, they contribute to orderly governance. By the twenty-first century, the growing dependence on risk management and the rapidly expanding use of information and communication technologies were driving significant changes in surveillance as social sorting.
While contemporary organizations have become dependent on surveillance as their key modus operandi, surveillance, in turn, is increasingly characterized by social sorting. A focus on risk and opportunity management underlies such social sorting, and the widespread use of new technologies and their associated statistical techniques facilitates it. With the decline of shared risks within state-sponsored welfare systems, for example, risk becomes increasingly an individual responsibility and the management of those risks becomes an industry in itself. In order to streamline and organize such risks, private corporations – such as Accenture or Experian – are engaged. They use Big Data analytics to sort risky individuals into categories for differential treatment.
Data are increasingly drawn on to make inferences about persons and groups. The personal data of some may be used for the financial gain of others, raising social justice and civil liberties questions. Scoring is a key way of deciding who should receive what in terms of goods and services, or who might be a suspect or a criminal. The scoring is done using algorithms that process personal data in order to make predictions that may produce negative discrimination just because individuals are categorized as members of a particular social group. This can affect access to healthcare, credit, insurance, social security, educational institutions, student loans and employment options. This in turn creates vulnerabilities such as unfair targeting by policing and security agencies.55
For example, a few years ago the UK government was concerned with so-called ‘high cost, high risk’ groups in British society deemed vulnerable to ‘social exclusion’.56 One such group is young people classified as ‘NEET’ (Not in Education, Employment or Training), and according to one study a NEET seventeen-year-old is likely to cost the British taxpayer ten times more by the time he or she is twenty-eight than counterparts in education, training or work, simply because they may claim benefits, use health services, be involved with the criminal justice system or not pay taxes. On this logic, social intervention, even from the onset of pregnancy, is required to avoid social exclusion.
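To see how such classification works mechanically, consider a deliberately crude, hypothetical scoring rule in Python. Every weight, field and category below is invented, but the structure mirrors the logic just described:

```python
HIGH_RISK_POSTCODES = {"X1", "X2"}   # invented proxy categories

def risk_score(person):
    """A crude, hypothetical scoring rule. The weights and fields
    are invented; the structural point is that attributes proxying
    for group membership move the score, so everyone who shares
    them is treated alike regardless of individual conduct."""
    score = 0
    if person["postcode"] in HIGH_RISK_POSTCODES:
        score += 40   # a proxy for class and ethnicity
    if person["status"] == "NEET":
        score += 30   # a group label, not an individual act
    if person["prior_contacts"] > 0:
        score += 20   # the only behavioural input, and it counts least
    return score

applicant = {"postcode": "X1", "status": "NEET", "prior_contacts": 0}
print(risk_score(applicant))  # 70: flagged before doing anything at all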
Thus locating, targeting, tracking and mapping the distribution of such groups is vital, as is extensive data-sharing to classify more carefully and to organize the necessary surveillance. In other words, once socially sorted, such groups – homeless people, drug users and previous offenders are viewed similarly – can expect greater and ongoing scrutiny, which may either exacerbate or ameliorate their situation.
One example does not tell the whole story, of course. Other government departments work in different ways, contributing to different outcomes. But the same basic processes are at work. In applications for a driving licence, the social sorting is fairly subtle, and produces different kinds of layering effects for persons with varying credit and purchasing histories. It is unlikely, of course, that licence applicants would guess that the speed of service received depends on apparently unconnected details like whether or not they have dealings with mail-order companies. What this does demonstrate is the increasing reliance of one agency or unit on data – and even data analysis – from another. Apparently trivial commercial data may be used to sort and make judgements about behaviours, both current and future.
On the other hand, data may be sought for controlling crime and violence in ways that embody some very telling judgements. Muslims living in Birmingham, UK, for example, found in 2010 that they were the particular targets of a relatively novel use of cameras set up for automated number plate recognition of their vehicles. These were disproportionately deployed in areas of Birmingham with high Muslim populations, under the rationale of combating antisocial behaviour, vehicle crime and drug dealing in the area, but were actually paid for out of a Terrorism and Allied Matters Fund – a fact that was not made clear to the local population.57
Social sorting draws attention to the ways in which the processing of personal data for surveillance purposes of various kinds contributes to the drawing of social distinctions. People are sorted into social categories (these may include gender, socio-economic, religious and ethnic/national) so that their classification may be used to distribute opportunities and risks according to the criteria of the surveillant organization. The ‘pie slices’ are cut in a number of ways, sometimes subtly and complexly, but with very real consequences in the everyday world of work, travel, consuming and relating to official bodies. As the examples show, the distinctions are often reinforced in the process, as well.58
It is important to recall that, as the Surveillance Society report says, ‘no one voted for such systems. They come about through processes of joined-up government, utility and services outsourcing, pressure from technology corporations and the ascendancy of actuarial practices.’ Social sorting is nowhere an official, legislated process. It is one in which statistical categories determine differential treatment for different population groups, directly affecting their life-chances and opportunities. But while it clearly affects social ethics and justice, it is not in any sense subject to democratic participation.
What Charles Tilly once called ‘durable inequality’ is now enabled by the ‘smartness’ of the internet of things and Big Data applications. The rules are now embedded in software protocols, making them even less visible to those whose daily lives are affected by them. But it is not impossible, as the geodemographic marketing example shows, that aspects of their data doubles may be made visible to the embodied persons that generate them. If this were to occur, it would bring new factors into the surveillance imaginaries and, potentially, new surveillance practices.
This chapter has explored some ways in which items that start life as attractive, even seductive, novelties end up contributing to a process of normalization of surveillance. People respond to surveillance as it becomes increasingly embedded in the contexts and routines of everyday life. The flagship of this process is the ubiquitous smartphone, but it is also evident in more abstract-sounding processes of smart city development, the internet of things and wearable devices, in which communication between objects, and between objects and persons as well as between persons themselves, is creating new sorts of relationships between data and people.
One important way in which surveillance imaginaries and practices have taken on a new dimension is that they have become decidedly participatory. This is not to say that earlier forms of surveillance require no participation – you still have to be on the street for the camera to detect your presence or at airport security to have your bags and details checked, and to play your role. But today, participation is a much more obvious feature of surveillance – users know that their phones, wearables and other gadgets and platforms are interacting with their activities, even if they do not understand the extent of that interplay.
Participation has been noted here in several contexts beyond smartphones – facial recognition technology, building sensors, self-tracking devices, even semi-autonomous vehicles – but it is always negotiated between various actors whose intentions and actions still have to be explored and translated.59 This ‘participatory turn’, as Julie Cohen calls it,60 becomes even more evident in the world of social media and gaming that the following chapter explores. As yet, its implications are hard for those involved to grasp, not least because the algorithms that guide these systems’ operation are opaque to users.
Conventional forms of surveillance, seen in national security or policing, are not generally associated with aesthetic pleasures and tend to elicit instead various kinds of caution and compliance. With the ongoing levels of innovation in new media, along with growing familiarity with everyday forms of baked-in surveillance, cultures of surveillance develop that are less anxious, often because they are consumer related and play on features such as convenience or conviviality. This means that they may themselves quickly become taken for granted as necessities for everyday living or that they may serve to domesticate aspects of more obviously law-and-order-oriented surveillance.
But the key innovations of the twenty-first century, seen here in smartphones and social media, are associated with some deeper changes. On the one hand, so-called soft surveillance has some subtle ways of ensuring compliance, for instance through social sorting, that reduces the options open to consumers or users. And on the other, explored in the next chapter, is evidence that surveillance practices increasingly include more conscious complicity in our own surveillance and engaging in do-it-yourself surveillance. We look at each of these, and other aspects of participation, in the next chapter.