8

The Ethics Industry

Oppression can be defined as a situation where bias held against a section of people becomes elevated to general wisdom, with social and human rights denied to them as a result, whether this be implicitly or explicitly. We have seen how AI underpins online manipulation, social control and the radical transformation of labour. We have looked at how ignorance around the risks plays into the hands of companies inundating the market with often inaccurate products with little accountability and oversight. These products progressively software us out, make decisions for us, decide whether we are allowed a visa, a loan or a job interview. All this relies on us not noticing, not understanding and thus, not challenging.

So the crucial question arises for women worldwide: is AI simply going to encode oppression by embedding it even further as every aspect of our lives is progressively taken over by software? The ‘matrix of domination’, a phrase coined by academic Patricia Hill Collins, demonstrates how class, race and gender are interlocking systems, so those who find themselves at the intersections are hit hardest.112 The truth of this is well illustrated by the fact, already discussed, that AI-driven facial-recognition systems fail disproportionately on black women, and by the way AI is used to tighten social control over poorer and more vulnerable citizens.

Neither the zeitgeist around ethics nor the rallying cry for more women in technology is sufficient to alter the alarming direction AI is taking. That is because it is so easy to cut corners when it comes to ethics.

Ethics, as philosopher Luciano Floridi explains, is increasingly at risk of turning into ‘ethics bluewashing’, where ethical principles are established to justify and further commercial aims, and not the other way around.113 The proliferation of ethical tools, often under the umbrella of the same large organizations that have dubious privacy practices, does make people wonder how sincere those efforts are, even whether they are a cynical ploy to distract consumers’ attention from the truth.

‘I’d blush if I could.’ That is how, until recently, Siri responded to being called a ‘bitch’.114 Social pressure and international outrage finally made Apple change this default response, but, despite a last-minute patch, the problem of female-gendered interfaces remains.

There are strong commercial incentives behind choosing characters and creating artefacts that the user might feel more comfortable conversing with, thus encouraging engagement and accelerating the accumulation of data. Perhaps technology artefacts are given a female identity to help seduce users into thinking they are pliable and benign. Seductive and flirtatious Alexa, Siri and Cortana all have female names. They are there to serve, support and help us with all our activities.

Our five-year-old boys grow up thinking that it is normal to have a female ready, at all times, to serve them, and our little girls may perhaps find it natural to think that this digital servitude extends to the real world.

Once, I was talking about this on the radio, and the presenter interrupted, sounding exasperated, ‘What do you have against the fact that people trust female voices, and find them more reassuring?’ Here is the problem: the female cyborg – whether Alexa or Sophia, the robot with ‘empathy’ – smiles and supports but is not entitled to challenge, and that is why it is considered reassuring. The female cyborg obeys.

We have not moved far from Galatea, Pygmalion’s beloved statue, and perhaps the prototype of today’s sex robots. It should alarm us all that a 2017 survey showed that 40 per cent of men would consider buying a sex doll in the next five years.115 The reasons vary, from sexual aids to loneliness and social anxiety, with 11 per cent admitting they prefer dolls as a replacement for a human.116

Samantha is a smart sex doll. It responds to human touch, is blonde and skinny and has sensual lips. It was introduced at the Ars Electronica Festival in Linz, Austria, in 2017 to showcase the advances in animatronics and AI that are making sex dolls ever more lifelike, but Samantha didn’t remain intact for long. Men explored the doll so aggressively that its body was ruptured in several places. Think about this for a moment: the objectification of a female-looking body exposed and available for males to savage.117

In 2017, a company in California produced Harmony, a prototype hyper-realistic sex robot that can tell jokes, quote Shakespeare and remember birthdays. It can also hold a conversation about music, movies and books. And, of course, Harmony will have sex whenever its owner demands.118

Unsurprisingly, smart-doll brothels are popping up in quite a few locations. KinkySdollS, a Canadian firm, attempted to launch in Houston as part of a US expansion. Opposition from the mayor and from Christian groups meant that the project was eventually halted.119

It has been argued that sex robots will curb the trafficking of women and girls,120 but as yet there is no evidence for this argument. What is far more likely is that this form of 3D pornography will only increase women’s subjugation and the violence of men’s sexual fantasies. The point here is that the more sophisticated these artefacts become (owing to the progress in animatronics and AI), the more difficult it will be to eliminate their gendering and sexualization.

The now-familiar answer seems to be that we need more women in technology. I think this is right – but also misleading. There is no doubt that the relationship between women and technology has been complex over the years, and I feel some sympathy for those who have been reluctant to engage with tech as a conscious or unconscious response to military and scientific abuses within the patriarchal system.

A couple of years ago, I flew to Sweden to address coders and developers on the opportunities and challenges of AI. I remember being shocked at how male-dominated the audience was in the country of Pippi Longstocking and a statutory full year of paid parental leave. I remember debating these issues at a conference, where someone raised an interesting point of view: that the most gender-equal nations, places like Sweden, with comparatively more women in politics and on company boards, have fewer women in tech, because in these places women can exercise choice and focus on something more inherently feminine, like art and creative occupations.

While some find this theory attractive, it is clear to me that the reason for this imbalance has more to do with culture and, in particular, with the binary Western approach to the world: women as emotional and irrational; men as rational, thus numerical. Women have disappeared from the history of computing with the relatively recent professionalization and managerialization of coding.

It wasn’t always thus: women were in computing when it was viewed as an extension of secretarial work; women were at Bletchley Park, working with Alan Turing and computing manually what is now done by machines. Performing those complex computations was perhaps viewed as scarcely different to the methodical work of a secretary, and women, used to embroidery and knitting and the precision and patience that these tasks require, were considered perfect for this, too.

At some point, with powerful computing machines being introduced by companies like IBM, humans were no longer needed to compute – complex machines could do that for them. That is when the number of women started to decrease progressively. This didn’t occur by chance. The higher-status computational work remaining was taken over by male managers.

Computing was replaced by the commodification of data, soon to be turned into capital – the logic of accumulation once again favouring men. The culture of the programming environment didn’t suit women: testosterone has been, and remains, dominant.

A recently circulated internal document has exposed that one year after Google staff’s #MeToo walkout, little has changed.121 In November 2018, 20,000 employees around the world walked out of Google’s offices in protest at the revelation that the company had paid out over $100 million to multiple executives accused of sexual harassment.122 At the time, the company apologized profusely, saying that it would overhaul its policies and make Google a safe place to work.

A year later, employees spoke of being silenced, of suffering retaliation and of still being too worried to officially report any harassment. This culture is not uncommon in the tech sector and is a problem that is putting women off entering it. This is important as we grapple with the need for more women in technology to design the tools that are shaping so much of our lives today.

Perhaps structural defects leading to bias and algorithmic racism would not happen with a diverse workforce, one that looks at the intended and perhaps unintended consequences of a product when deployed in our world. More women of colour in coding would surely mean that greater attention would be placed on data sets, processes and possibly distorted outcomes discriminating against whole sections of our society.

However, we cannot and must not let ourselves be fooled into thinking that, as long as women are in the operating room, everything is going to be fine. AI is far more than technology, and it is way beyond technology that we must set our sights if this is not to be yet another setback for women from every walk of life.