The end of capitalism has often been imagined as a crisis of epic proportions. Perhaps a financial crisis will occur that is so vast not even government finances can rescue the system. Maybe the rising anger of exploited individuals will gradually congeal into a political movement, leading to revolution. Might some single ecological disaster bring the system to a halt? Most optimistically, capitalism might be so innovative that it will eventually produce its own superior successor, through technological invention.
But in the years that have followed the demise of state socialism in the early 1990s, a more lacklustre possibility has arisen. What if the greatest threat to capitalism, at least in the liberal West, is simply lack of enthusiasm and activity? What if, rather than inciting violence or explicit refusal, contemporary capitalism is just met with a yawn? From a political point of view, this would be somewhat disappointing. Yet it is no less of an obstacle for the longer-term viability of capitalism. Without a certain level of commitment on the part of employees, businesses run into some very tangible problems, which soon show up in their profits.
This fear has gripped the imaginations of managers and policymakers in recent years, and not without reason. Various studies of ‘employee engagement’ have highlighted the economic costs of allowing workers to become mentally withdrawn from their jobs. Gallup conducts frequent and wide-ranging studies in this area and has found that only 13 per cent of the global workforce is properly ‘engaged’, while around 20 per cent of employees in North America and Europe are ‘actively disengaged’.1 They estimate that active disengagement costs the US economy as much as $550 billion a year.2 Disengagement is believed to manifest itself in absenteeism, sickness and – sometimes more problematic – presenteeism, in which employees come into the office purely to be physically present.3 A Canadian study suggests over a quarter of workplace absence is due to general burn-out, rather than sickness.4
Few private sector managers are required to negotiate with unions any longer, but nearly all of them confront a much trickier challenge: dealing with employees who are regularly absent, unmotivated or suffering from persistent, low-level mental health problems. Resistance to work no longer manifests itself in organized voice or outright refusal, but in diffuse forms of apathy and chronic health problems. The border separating general ennui from clinical mental health problems is especially challenging for managers in twenty-first-century workplaces, since it requires them to ask personal questions about matters they are largely unqualified to deal with.
Lack of engagement from the workforce also registers as a problem for governments, inasmuch as it bites into economic output, and in doing so hits tax receipts. In societies with socialized health insurance and unemployment insurance, the problem is far more serious. There is a growing economic problem of individuals dropping out of work due to some ill-defined, intangible personal problem, then gradually sinking into a more generalized inactivity. These people may show up at the doctor’s surgery on a regular basis, making complaints about undiagnosable pains and problems. This is often because they are lonely and have nobody else to speak to. Unemployment undermines their sense of self-worth, and inactivity brings various other psychosomatic problems with it. The end result is a general deflation of psychological and physical capacity, which in many societies produces costs for the state to pick up.
Nor is the economic threat posed by declining mental health confined only to the periphery of labour markets. The World Health Organization caused a stir in 2001 by predicting that mental health disorders would become the world’s largest cause of disability and death by 2020. Already, some estimates suggest that over a third of European and American adults are suffering from some form of mental health problem, even if many are going undiagnosed.5 The economic costs this imposes are vast. Mental health disorders are estimated to cost 3–4 per cent of GDP in Europe and North America. In Britain, the overall cost of this to the economy (including various factors, such as workplace absence, reduced productivity and medical costs) is put at £110 billion per year.6 This is already far more than the economic cost of crime, yet it is a figure that is expected to double in real terms over the next twenty years, unless the current trend is diverted in some way.7
The causes of mental health problems are obviously complex and do not lie simply in the economy any more than they do in brain chemistry. But it is the way in which these problems manifest themselves in the workplace, threatening productivity as they do so, that has placed them amongst the greatest problems confronting capitalism today. It is the principal reason that the World Economic Forum is now so concerned about our health and happiness.8 The murky grey area separating workplace disaffection from a clinical disorder has required managers, and the human resources profession especially, to equip themselves with various new ways of intervening in the minds, bodies and behaviours of their workforce. The term most commonly used to describe the goal of these new interventions is ‘well-being’, which encompasses the happiness and health experienced by employees.
There is a clear economic incentive for managers to consider the positive attitude of employees. Endless studies have shown that workers are more productive when they feel happy, possibly by as much as an additional 12 per cent of output.9 And in workplaces where they feel respected, listened to, consulted and involved, they are more likely to work harder, and less likely to take sick leave. Where employees have no say in how their work is organized, this is known to generate some of the psychological problems that now concern businesses, up to and including mental health problems.10 By emphasizing well-being, managers hope to turn a vicious circle of disengagement and ill-health into a virtuous one of active, fulfilling commitment.
It is tempting to be cynical about some of this: the manager is after all still attempting to extract effort from the worker. But why not also recognize the opportunity contained in this current business anxiety? If capitalism is being ground down by the chronic, unspecifiable alienation of those it depends on, then surely solving that problem might also open up possibilities for political reform? The hard economic costs that ennui now places upon employers and governments mean that human misery has shown up as a chronic problem that elites cannot simply shove aside. The question of what type of work, and what type of workplace organization, might generate a real sense of commitment and enthusiasm on the part of workers should not be abandoned altogether.
The difficulty is that the enthusiasm managers are seeking to promote is no less slippery than the psychosomatic problems they are seeking to avoid. A report commissioned by the UK government on the importance of employee engagement found it impossible to say exactly what this gaseous entity consists of. Expert insights that ‘you sort of smell it’ and ‘know it when you see it’ confirmed a shortage of objectivity on this particular issue.11 Managers and policy-makers yearn for a hard science of workplace happiness. But it is with that sort of hard science that many of our problems begin.
Happiness boot camps
Confronted by other people’s problems which are both ambiguous and personal, senior decision-makers have a tried and tested coping method: bring in the external contractors and consultants. There is copious political and market demand for experts willing to pronounce and act upon the well-being of others, on the basis of some presumed scientific authority. These sit on a spectrum between qualified medical practitioner and ill-informed bully. When handling painful issues of other people’s health and happiness, outsiders have the great advantage of being able to duck full moral accountability and, if necessary, withdraw from the job altogether. Bentham’s vision of a ‘National Charity Company’, a corporation established by the state to put people to work, foreshadowed today’s murky world of workfare that lies in the unaccountable gaps between market and state.
In its bid to push people off reliance on the welfare state and into the labour market, the UK government appointed the public service outsourcing company Atos to conduct ‘work capability assessments’ of individuals. As this agenda was ramped up by the Conservative-led government from 2010 onwards, it led to a number of tragedies and acts of cruelty. These included the suicide of a 53-year-old blind and agoraphobic man, Tim Salter, only weeks after his benefits were stopped in 2013, following an assessment by Atos that he was able to work.12 Atos also found individuals suffering brain damage and terminal cancer to be ‘fit for work’. In 2011, Britain’s General Medical Council investigated twelve doctors working for Atos as disability assessors, due to allegations that they were not performing their duty of care towards patients.13 Between January and November 2011, 10,600 sick and disabled people died within six weeks of their benefits being stopped.14 In one darkly comic computer malfunction, Atos confirmed that a disability benefit claimant was fit to work, even after they’d died of their illness.
When it comes to motivating people to seek work, the government once again stands back, letting its contractors perform the most controversial psychological interventions. Those being forced to seek work are assessed, in terms of their attitude and optimism, and then have their motivation reactivated. The companies that carry out this task in the British context are A4e and Ingeus, which hold contracts with the government to get unemployed people into jobs. Around a third of the people who come through their doors report some sort of mental health problem, although the companies suspect that the rate is really twice that. Questionnaires are used to try to spot the behavioural and mental obstacles to working (lack of jobs not being viewed as an adequate excuse).
In the eyes of these contractors, unemployment is really a ‘symptom’ of some broader personal malaise, which manifests itself in inactivity. The solution consists of a range of coaching programmes, combined with ‘behavioural activation’ courses, aimed at restoring the unemployed individual’s self-belief and optimism with ruthless efficiency. As one participant in an A4e course reported, they were shouted at by a self-help guru to ‘talk, breathe, eat, shit belief in yourself’ and that ‘you are the product – you either believe it or you don’t’.15
Wherever the economics of mental health become more explicit, the gap between care and punishment tends to shrink. In 2007, the economist Richard Layard laid out the ‘business case’ for cognitive behavioural therapy (CBT), demonstrating that it could save the UK government money, given the treatment’s brevity and apparent success rate in keeping people in work.16 This was instrumental in the creation of the Improving Access to Psychological Therapies programme, which involved a dramatic rise in the number of cognitive behavioural therapists trained and employed by the National Health Service.
But with the dawning of austerity, this sympathy for talking cures started to look somewhat different. In 2014, the government announced that disability benefit claimants could have their payments stopped if they refused to attend sessions of CBT. People would effectively be forced to receive a talking cure. Quite how therapy could be expected to ‘work’, when it was only being undergone under the threat of losing £85 a week, was not explained.
To close down every route for the avoidance of work, doctors have had to be conscripted into this policy agenda too. A UK government report published in 2008 complained that ‘the fallacy persists that illness is incompatible with being at work’, which doctors were guilty of peddling.17 A government campaign was launched to dissuade doctors of this, and their official ‘sick notes’ (which were once signed by doctors to declare that an individual shouldn’t work) were replaced by ‘fit notes’, requiring doctors to describe the remaining ways in which an individual could still be employed, despite any illnesses or disabilities. Doctors were encouraged to sign a draft statement scripted by the state, agreeing that work is good for people.
At the opposite end of the labour market, things look a lot sunnier, but somehow no less brutal. While Atos, A4e and Ingeus grapple with the apparent sluggishness and pessimism of the poor, high-end wellness consultants make large sums of money by teaching corporate elites how to maintain themselves in a state of optimal psychosomatic fitness. Classes such as Dr Jim Loehr’s ‘Corporate Athlete Course’ ($4,900 for two and a half days) introduce executives to elite ‘energy investment’ strategies, which will enable them to achieve a high performance level of physical and mental wellness. The American productivity guru Tim Ferriss sells advice on how senior managers should best employ their own brains over the course of the working day, following an earlier career selling dubious brain-enhancing nutritional supplements.
This consultancy circuit moves seamlessly among various apparently separate domains of expertise. The psychology of motivation blends into the physiology of health, drawing occasionally on insights from sports coaches and nutritionists, to which is added a cocktail of neuroscientific rumours and Buddhist meditation practices. Various notions of ‘fitness’, ‘happiness’, ‘positivity’ and ‘success’ bleed into one another, with little explanation of how or why. The idea which accompanies all of this is that there is one ideal form of human existence: hardworking, happy, healthy and, above all, rich. A science of elite perfectibility is built on the back of this heroic capitalist vision. The flip side of this, and the real driving force behind many executive wellness programmes, is a set of well-researched risks run by highly competitive businessmen, colloquially known as ‘burn-out’, which includes the higher chance of heart attacks, strokes and nervous breakdowns.
Of course, the majority of adults living in capitalist societies lie somewhere between the purview of Atos et al. and that of the executive wellness gurus. Is there no scope for a less individualized vision of well-being across the middle swathes of the labour market? Possibly there is. But here too are some brutally competitive injunctions offered to those managers worrying about worker disengagement and its impact upon productivity.
One of America’s leading workplace happiness gurus, entrepreneur Tony Hsieh, argues that the most successful businesses are those which deliberately and strategically nurture happiness throughout their organizations. Businesses should employ chief happiness officers to ensure that nobody escapes workplace happiness. But if this sounds like the recipe for inclusive community, it isn’t. Hsieh advises businesses to identify the 10 per cent of employees who are least enthusiastic towards the happiness agenda, and then lay them off.18 Once this is done, the remaining 90 per cent will apparently become ‘super-engaged’, a finding which is open to more than one psychological interpretation.
As the science of happiness has moved closer to the front line of profit-maximizing business, something curious has happened to it. For Bentham, happiness was something which resulted from certain activities and choices. Neo-classical economists such as Jevons and behaviourist psychologists such as Watson assumed something similar, implying that individuals could be lured to make certain choices by dangling a pleasurable carrot in front of them. But in the context of business consultancy and individual coaching, happiness looks altogether different. Suddenly, it is represented as an input to certain strategies and projects, a resource to be drawn upon, which will yield more money in return. Bentham and Jevons’s psychological premise, that money yields a proportionate quantity of happiness, is spun on its head, suggesting instead that a quantity of happiness will yield a certain amount of money.
One of a new generation of positive psychology management gurus, Shawn Achor, outlines a range of data in his book, The Happiness Advantage, suggesting that happier people achieve more in their careers.19 They get promoted more, sell more (if they work in marketing) and enjoy better health. Happiness becomes a form of capital on which they can fall back amidst the turbulence of an uncertain economy. It is, as the title of his book suggests, a source of advantage in the battle to succeed. If this was the limit of his wisdom, Achor might sound like a fatalist: optimists are just luckier in all regards than pessimists.
The crucial supplement to the data is that we are all, supposedly, capable of influencing our own happiness levels. Happiness, Achor tells us, is a choice. We can either choose to be happy (and consequently successful) or choose to be unhappy (and suffer the consequences). Neuroscientist Paul Zak, another leading speaker and consultant on these issues, suggests that we view our happiness like a ‘muscle’, which needs exercising regularly in order to keep it in full working order, for when we need it. Lurking within this highly individualized agenda is the capacity to blame people for their own misery and failure, both of which are matters that they have evidently failed to act upon adequately.
What does ‘happiness’ even mean, once it is being conceived of in this way? It seems to imply a source of energy and resilience, but always directed towards goals other than being happy, such as status, power, employment and money. In the face of workplace ennui and psychological stagnation, the motivational gurus simply demand more willpower. By this account, the activities that might result in happiness, such as socializing or relaxing, are only valuable to the extent that they might restore brain and body to a level of fitness, from which they can then be propelled forwards to the next business challenge. This particular version of utilitarianism means expanding corporate rationality further into everyday life, such that there is now even an ‘optimal’ way of taking a break from work, and simply going for a walk can be viewed as a calculated act of productivity management.20 What is going on? The misery of working people is a serious political issue. How did it become captured in this way?
The extraction of effort
The discovery of the ‘conservation of energy’ in the 1840s, which so excited physiologists and philosophers such as Fechner, also unleashed a wave of enthusiasm among industrialists and inventors. If quantities of energy remained the same, as they passed between man, matter, heat and motion, then mathematical analysis could yield ever-more productive technologies. The search for ‘perpetual motion’ machines was a manifestation of this optimism.
However, this enthusiasm was soon tempered by a more troubling discovery made by the physicist Rudolf Clausius in 1865. It transpired that, as energy changed from one state to another, the amount available for useful work was gradually diminished. This was the law of ‘entropy’, and it catalysed an outbreak of anxiety and pessimism regarding the very future of industrial capitalism. During the 1870s, as Jevons was converting economics into a form of psychological mathematics, physiologists – and industrialists – were growing increasingly concerned by the problem of physical human fatigue, especially in the factory. The Victorians had tended to view inactivity and unemployment as moral failings, associated with drink and bad ‘character’. But by the 1880s, there was a creeping concern that industrial work was simply grinding people down. Human beings were running out of steam.
A fin-de-siècle neurosis developed. As capitalism’s human resources were diminishing, the vitality on which Western civilization depended was in terminal decline. The syndrome of ‘neurasthenia’, a form of nervous exhaustion supposedly brought on by the strains of modern urban life, claimed thousands of victims among the European and American bourgeoisie. Progress just seemed like too much effort.
The science of work at the end of the nineteenth century was not entirely dissimilar to how it looks today. Fatigue was a preoccupation, just as general inactivity (for the poor) or burn-out (for the rich) is today. This was viewed as a matter of national economic priority: variations in national economic output were attributed to differences in the physiology and nutrition of rival national workforces.21 As one study suggested, maybe Britain’s economic advantage over Germany lay in the fact that its workers ate more meat, whereas Germany’s ate more potatoes. The science of ergonomics developed to study and photograph bodies in motion, in the attempt to spot precisely where energy was being wasted. The muscles, and even the blood, were examined, to try to understand how entropy afflicted the human body in the workplace.
This was the context into which the mechanical engineer Frederick Winslow Taylor launched his career as the world’s first management consultant. Taylor was born into a prominent and wealthy Philadelphia family, with roots stretching back to Edward Winslow, one of the passengers on the Mayflower. This heritage was crucial. It was his eminent family name which granted him privileged access to the industrial firms of the city, in ways that would be decisive for his career. During the 1870s and 1880s, Taylor worked for a number of successful manufacturing and steel plants in the area, achieving automatic promotions to managerial positions, on account of his family connections.
Taylor was never an industrialist as such – he’d originally hoped to become a lawyer. Like every management consultant who would come after him, this put him in an ambivalent position, both an insider and an outsider. This granted him an unusual view of the shop floor of manufacturing plants, which he was able to look down on from a dispassionate white-collar position, with an air of objectivity. He had power within a business, but he also had scientific detachment. And much of what he saw, from this observational vantage point, looked deeply wasteful. There was no methodical, scientific analysis going into the design of work processes. Managers had a given quantity of resources, and a given quantity of hours in the day, but seemed bereft of any mathematical logic through which to exploit these for the greatest output.
Taylor never remained in the employment of a single company for very long, again setting a precedent for the consultancy industry that came after him. He kept moving from one Philadelphia manufacturer to the next, amassing insights into what was preventing more efficient modes of workplace organization. It was only in 1893 that he formally established himself as an independent consultant and began to sell his knowledge. His business card read, ‘Consulting Engineer – Systematizing Shop Management and Manufacturing Costs a Specialty’.
In the late 1890s, Taylor was hired by Bethlehem Steel to study the manufacture of pig iron. This was the topic of his first quantitative, scientific analysis of ‘time and motion’ in the workplace, specifically looking at how to increase the amount of pig iron that labourers could load onto a wagon in a given day.22 He not only looked at the process of labour itself, but also the physical conditions of work, and the physical condition of the individual labourers. He broke production down into individual tasks to be logged and rationalized. Even if economics had recently been converted into a utilitarian study of consumption, the problems of industrial management remained thoroughly physical, of how to extract as much produce from as few machines and human bodies as possible. He claimed to have increased the average output of pig-iron handlers from 12.5 to 47.5 tons per day, purely by rationalizing their time, motion and monetary incentives.
The Bethlehem study turned Taylor into a celebrity in business and academic circles. In 1908, Harvard Business School offered an MBA for the first time, but without much of a clue as to what would go in it. As the world’s now pre-eminent scientist of management, Taylor was invited to lecture on the course and in 1911 published a synthesis of his various theories, The Principles of Scientific Management. Among businessmen, time and motion studies became all the rage, arriving in European factories in the years immediately prior to World War One.
While the immediate clients for Taylor’s services were interested in maximizing their business revenue, the political appeal of scientific management was extremely broad. American progressives believed that with greater scientific insight, corporations could be harnessed for the common good. Socialists, including Lenin, saw in Taylorism a model for how society itself could be run in an efficient manner, without reliance on markets.
Taylor himself also attached a loftier social purpose to his new science, believing that scientific management would spell the end of industrial conflict, substituting ‘hearty brotherly co-operation for contention and strife’. One of his professed advantages, when he entered firms as an outsider, was that he could avoid being dragged into industrial conflicts between management and labour and maintain a politically neutral position. In workplaces that had become conflict-riven, the consultant could have a tempering effect – though of course it was never labour that had invited the consultant to intervene in the first place.
The accident of Taylor’s aristocratic roots created a template for how management consultants have behaved ever since. McKinsey & Co., Accenture and PwC presume a similar form of privilege, promising to sprinkle expertise upon organizations and workplaces, then very often exiting before the results become too apparent. That may be Taylor’s most powerful legacy, because beyond that, the term ‘Taylorism’ has come to acquire some largely negative connotations. Even as companies continue to push surveillance and scientific analysis further into the lives of their workforce, now through digital data analytics and mobile devices, it has been deeply unfashionable for some time to hark back to the hard, scientific analysis of Frederick Taylor. The reason for this is simple: the brutalist approach to management is deemed to make people unhappy.
It would be perverse to defend Taylorism, but there was at least a transparency about its logic. Workplaces and managers existed to extract value, in the most efficient way possible. Workers were never expected to like this, which was a freedom of sorts. As Ian Curtis, the lead singer of Joy Division who hanged himself aged twenty-three, once said, ‘I used to work in a factory and I was really happy because I could daydream all day’. Labourers in a Taylorist factory brought their physical capabilities into work, to be exploited for sure, but were never expected to give anything more personal or intangible. And this is exactly why managers soon turned their backs on Taylor’s version of scientific management.
Psychology gets to work
In 1928, a researcher from Harvard Business School sat down with a young woman working in a telephone production plant in Cicero, Illinois, and asked her an unusual question: ‘If given three wishes, what would they be?’ The woman paused to reflect before listing her answers. ‘Health, to take a trip home at Christmas time, and to take a wedding trip to Norway next spring.’
The reason the question was unusual was that the researcher was not, ultimately, interested in the woman’s life or wish fulfilment. Like Taylor before him, he was interested in her productivity. The enthusiasm for Taylorism had waned considerably since its heyday in the years prior to World War One, but Taylor’s basic scientific ambitions were still largely unquestioned among management theorists. Only the year before, in 1927, Harvard Business School had established a Fatigue Laboratory, containing rooms of various temperatures and state-of-the-art instruments to study the reactions of the human body to different types of work and recuperation. In an economy still dominated by manufacturing and physical labour, physiology and infrastructure seemed to hold the key to unleashing better workplace performance. Managers did not consider the Christmas or travel plans of their employees to be any of their business.
The man asking the questions in that telephone production plant was Elton Mayo, an Australian polymath of somewhat dubious scholarly provenance. He had dabbled in philosophy, medicine and psychoanalysis, and was seduced by many of the doom-laden cultural critiques published in the years following World War One, such as Oswald Spengler’s Decline of the West. Mayo was convinced that civilization was heading for a fall, and that industrial conflict would be its trigger. Trade unions and socialists were thus a threat, not only to management and capital, but to world peace.
In some of Mayo’s more outlandish theories, socialism was a symptom of physical fatigue and psychiatric illness. ‘To any working psychologist’, he asserted, ‘it is at once evident that the general theories of Socialism, Guild Socialism, Anarchism and the like are very largely the phantasy constructions of the neurotic’.23 He believed that the only solution lay in corporations coming to provide forms of psychoanalytic therapy to their employees, which would soothe them, bringing them closer into the arms of their employers. Employees who resisted the authority of their managers were in need of treatment.
Mayo emigrated to the United States in 1922, firstly to San Francisco, where he took a visiting lectureship at Berkeley. He soon discovered that the Rockefeller Foundation was a source of considerable funds for anyone seeking to pursue business-friendly research, and he won a series of lucrative grants over the next twenty years, which kept him in some personal luxury. These studies took him to the East Coast, where he had the chance to visit a number of factories and consider how his ideas might be applied. His psychosomatic theories assumed that psychiatric problems in the workplace would show up not only in low productivity and industrial unrest, but also in high blood pressure. Between 1923 and 1925, he toured manufacturing plants in the Boston area in the company of a nurse and a blood pressure gauge, attempting to prove this link between the mental, the economic and the physical, which he was convinced existed quite regardless of the evidence.
The psychological study of work was an emerging field during the 1920s, led by some of the same scholars who had previously pioneered the psychological study of advertising a few years earlier. But Mayo had some much more far-reaching theories regarding the ways in which the insights of psychology might fundamentally reform and rescue capitalism. By focusing on the entire person in the workplace, including all of their personal concerns and mental well-being, work might provide the labourer with their deepest source of meaning, and offset the risk of industrial upheaval once and for all. In 1926, Mayo was hired by the Harvard Business School.
The research in Cicero, Illinois, known as the Hawthorne Studies, after the name of the manufacturing plant where they were carried out, quickly became a landmark of management science.24 Mayo was one of the founders of the Fatigue Laboratory, but the impact of his work was to divert attention away from the working body and towards the mental happiness of employees. According to the mythology that now surrounds the Hawthorne Studies, Mayo’s main discovery was accidental. The working women who were chosen to be observed and interviewed were taken off the regular shop floor and into a test room, where they were able to relax and interact in a more informal and convivial atmosphere. This seemed to correlate with improved performance, and Mayo had an inkling of why: the study itself, including the interview process, was what resulted in the productivity increases, because the women had developed a higher sense of group identity with one another. Their enthusiasm for work had grown, as their ability to form relationships with one another increased. The general phenomenon, whereby research subjects respond to being studied, is now known as the ‘Hawthorne Effect’ for this reason.
The lesson that Mayo drew from his repeated visits to the Hawthorne plant was that managers had to learn how to talk to their employees if they wanted to extract greater productivity from them. An unhappy worker was also an unproductive worker, and the unhappiness stemmed from a deep-set feeling of isolation. Managers also had to understand the unique psychological properties of social groups, which were not simply reducible to individual incentives, as Taylorism and neo-classical economics had supposed. A thriving and collaborative group identity could do far more for an employee’s happiness, and hence for the manager’s bottom line, than a pay rise.
There is some basis for doubting whether Mayo was really reporting on data acquired at Hawthorne or simply repackaging some theories that he’d long held about the future of capitalism. In fact, the productivity of the women did coincide with a pay increase in 1929, but Mayo was absent at the time and chose to ignore this in his analysis.25 Regardless of the scientific validity of his work, however, Mayo’s impact on management thinking was profound and long lasting. Whenever we now hear that managers must focus on the ‘whole person’, and not just the ‘employee’, or that employee happiness is critical to the bottom line, or that we must ‘love what we do’ or bring an ‘authentic’ version of ourselves to work, we are witnessing Mayo’s influence. When managers strive for more laughter in the workplace, as some consultants now insist they must, or seek to transform its smell so as to optimize our subjective feelings, they are practising what Mayo first preached.26
Therapeutic management
Within the longer history of happiness expertise, what is interesting about Mayo’s intervention is that he downplayed the more obvious material ways of tweaking the pleasures and pains of the mind. Neither money nor the physical body were deemed adequate for understanding or influencing levels of happiness, once the workplace came to be understood in terms of group psychology. Instead, talking to workers and facilitating their relations with one another became the main instruments for gauging and improving their happiness. Management, which originated as a technique for controlling slaves on plantations, and developed as a means of running heavy industrial corporations, had become a ‘soft’, social and psychological skill.
While Mayo did not conceive of things in quite this way, this was a form of psychosomatic intervention, like a placebo. The aim of management in the 1930s was, after all, still the same as it was in Taylor’s day: to increase output of physical produce. But now, rather than focusing on the physical and physiological work process, managers would focus on the social and psychological elements, in the expectation that this would yield behavioural, physical and economic improvements.
The term ‘psychotherapy’ today refers to a range of treatments, from long-term psychoanalytic relationships to quick fixes such as cognitive behavioural therapy (CBT) that are more akin to training or coaching. But the first known uses of the term referred to the ‘talking cures’ offered by medical doctors in the late nineteenth century, who came to recognize that their patients often responded as much to the manner in which they were spoken to as to the medicinal treatment they received.
What Mayo was recommending was the industrial parallel to this. An open, conversational relationship could be conducted in such a way as to bring about a change in the worker’s mentality, and a consequent change in their physical performance. Speech was instrumentalized, to make people feel better, and as a result, behave better. As a tonic to the harsh mechanics of Taylorism, this made perfect sense. It could even be taken in some more emancipatory directions, to investigate groups as autonomous entities, which might allow firms to be more democratically managed in future. Research on group psychology was put to various uses over the 1940s and 1950s, from the analysis of tank command during the war, to the analysis of consumers via focus groups.27
Mayo personally hoped to anaesthetize political sentiments. Therapeutic management would reduce unhappiness and, with it, resistance. Other avenues were possible, however. Once dialogue and co-operation become viewed as an essential element of economic production, one sees the glimmer of a more transformative economic democracy. Once the woman working on the shop floor is asked what her three wishes are, might the next step not be to invite her to have a say in how the business is managed? And might things not progress politically from there? Mayo would have scoffed at the idea. But the critique of management oligarchy cannot discount the emancipatory potential of social psychology altogether.
Yet the analogy to psychosomatic medical treatments would gradually become more telling as the post-war period progressed, for a couple of coincidental reasons. Firstly, the nature of work in the West became progressively less physical over the second half of the twentieth century. By the 1980s, an employee’s customer care, service ethic and enthusiasm were not simply mental resources, which existed to help churn out more products: they were the product. The importance of employee happiness and psychological engagement becomes all the greater once corporations are in the business of selling ideas, experiences and services. Businesses speak of ‘intangible assets’ and ‘human capital’ in the hope of capturing this amorphous workplace ethos, but in practice it resembles neither an asset nor capital. Some other way of conceiving of work is required.
Secondly, the concept of health started to undergo some profound changes. In 1948, the newly founded World Health Organization redefined health as ‘a state of complete physical, mental and social well-being’ – an almost utopian proposition that few of us ever attain for very long. Intangible aspects of health and illness came to the fore. In particular, as mental asylums declined, the notion of ‘mental illness’ emerged as a category that could be applied liberally to people living relatively ordinary lives in the community, not unlike sufferers from common bodily illnesses.
The awareness that mental processes were a crucial component of health exerted a profound influence across health policy and medical practice, altering the nature of medical expertise as it did so. This was sometimes known as ‘experience medicine’, as it brought the experience of the patient, and not just their body, into the medical assessment for the first time. By the 1970s, a range of quality-of-life measures was being used to assess health outcomes, taking into account the subjective perspective of the sufferer, and not simply their physical condition.28 In place of binary analyses, between life and death, health and disease, new sliding scales of wellness were emerging. This is partly a symptom of medical progress: as medicine becomes better at preventing death, so attention turns to the question of how well it is able to support life.
What does any of this have to do with management or work? The problem confronting managers and policy-makers over the second half of the twentieth century was that everything seemed to be evaporating into thin air at the same time. Work was becoming intangible as manufacturing went into decline. Illness was becoming intangible as mental and behavioural problems increased. Money itself was becoming intangible as the financial system globalized from the late 1960s onwards. Problems of activity and enthusiasm moved elusively between the domains of medicine, psychiatry, workplace management and economics. The challenges of health care and those of business were becoming harder to disentangle, with the issue of mental health at the interface between the two. The job of management increasingly came to resemble psychotherapy in that original sense of ‘talking cure’, of propping up the well-being of individuals, in order to keep their enthusiasm for service-based jobs as high as possible.
And as the nature of work and management changes, so too does the nature of resistance. Opposition to management typically takes a form other than that preferred by the manager himself. The classical mode of opposition to Taylorism, which seeks to reduce human beings to physical capital, is for the worker to speak back or strike via a trade union. The manager, having ignored the feelings or desires of the worker, is told that they cannot do so any longer.
As Mayo’s style of therapeutic management expanded over the post-war period, opposition to it began to take the opposite form. Gradually, as post-industrial workers were encouraged to be ‘themselves’, speaking ‘openly’ and ‘honestly’ to their manager, the sole remaining form of opposition was to return to the physical body once more. The only escape from a manager who wants to be your friend is to become physically ill. With the list of available diagnoses growing, and complete ‘health’ becoming idealized, sickness became one of the dominant ways in which refusal to work came to manifest itself, especially from the 1970s onwards. Evidently, management could not focus only on relationships and subjective feelings, any more than it could focus only on the productive body. What it needed, if it was to ensnare employees thoroughly, was a truly psychosomatic science that could treat the mind and the body as an integrated part of a single system to be optimized. This brings us to a final character in the story of psychosomatic management.
Holistic work and well-being
In 1925, a nineteen-year-old Austrian medical student at Prague University named Hans Selye noticed something so obvious that he almost didn’t dare report it to his teacher. As his class was observing various patients with a range of different maladies, it dawned on Selye that all of the patients bore some resemblance to each other, regardless of their medical condition. Each of them reported aches and pains in the joints and a loss of appetite, and each had a coated tongue. In short, all of them looked ill.
He later recalled this moment as follows:
Even now – after half a century – I still remember vividly the profound impression these considerations made upon me at the time. I could not understand why, ever since the dawn of medical history, physicians should have attempted to concentrate all their efforts upon the recognition of individual diseases and the discovery of specific remedies for them, without giving any attention to the much more obvious ‘syndrome of just being sick’.29
When he did share the insight with his professor – namely that sick people look sick – he received the sarcastic reply that, indeed, ‘if a man is fat, he looks fat’. But Selye refused to abandon his insight. As a child he had accompanied his father, one of a long line of doctors in the Selye family, on his visits to poor households in Vienna, and had a strong vocation towards a traditional, somewhat holistic, understanding of the healing process.30 As the medical ‘psychotherapists’ had realized, the doctor’s personal interaction with the patient was a crucial ingredient in how they responded to treatment.
The history of utilitarianism is littered with dashed hopes that there might be a single measure of human optimization which could serve as the instrument through which all public and private decisions might be taken. This ideal rests on the hope that the ambiguity and plurality of human culture might be overcome through knowledge of a single quantifiable entity. Whether it is via the idea of utility, energy, value or emotion, the project of monism always involves this form of simplification. In his apparently banal observation that ill people look ill, Selye had hit on another version of this. It took him another ten years to develop this into a scientific theory, which he termed ‘General Adaptation Syndrome’.
The novelty of this idea, from the perspective of medicine, was that the syndrome Selye was describing was non-specific: it had a common set of symptoms, but these were not tied firmly to any particular causes or disorders. He explored this through various experiments on animals, plunging them into cold water, cutting them, implanting poisons into them, to see how quite disparate forms of brutality could prompt identical biological responses.
Like any biological system, an animal body experiences various external stimuli, intrusions and demands which it has to respond to. What Selye was interested in was the nature of this response, which could sometimes become a problem in its own right. Biological systems which are overstimulated start to shut down; the same also happens when they are under-stimulated. The health of any organism depends on an optimal level of activity, not too much, not too little. Humans were no different, as far as Selye was concerned. The patients who simply ‘looked ill’ in his medical class that day were all displaying a common form of physical reaction to a very diverse set of illnesses. A monistic theory of general wellness was emerging.
Until the 1940s, the term ‘stress’ was used principally in reference to metals and was virtually unknown outside the worlds of engineering and physics. An iron bar becomes ‘stressed’ when it is unable to cope with the demands that are placed on it. Selye recognized that what engineers saw as ‘wear and tear’ in, say, a bridge, was the same problem as what he had termed ‘General Adaptation Syndrome’ in the human body. General Adaptation Syndrome was effectively an indicator of the ‘rate of wear and tear in the body’.31 In the aftermath of World War Two, he re-christened the syndrome as ‘stress’. By the 1950s, this was a distinctive new field of medical and biological research.
Like Mayo, Selye never saw himself as an academic only: he was on a mission. According to his holistic understanding of illness, entire societies and cultures could become sick if they lost the capacity to cope with external stimuli and demands. Equally, they could slump into passive inactivity if they were never stimulated sufficiently. As he grew older, Selye developed this idea into something approaching an ethical philosophy, though a frighteningly egocentric one. A healthy society, he argued, is built around ‘egoistic altruism’, in which every individual sets about doing his utmost to win the adoration of others. This produces a form of natural equilibrium, in which the egotist becomes integral to his own social system:
No one will make personal enemies if his egotism, his compulsive hoarding of valuables, manifests itself only by inciting love, goodwill, gratitude, respect, and all other positive feelings that render him useful and often indispensable to his neighbours.32
Despite his aspiration to offer a science capable of diagnosing every social problem, Selye stuck firmly to biology when it came to seeking explanations. His characteristically monistic assumption was that any society or organization was merely a larger, more complex biological system, whose behaviour could be reduced back to the actions of organisms and cells.
Away from Selye’s own biological research, and his macho libertarian politics, the non-specific nature of stress represented an opportunity which would eventually permeate the world of management. Stress, as Selye had argued, is simply a particular type of reaction to any excessive demand. This was equally amenable to psychological or organizational forms of exploration. In fact, without using the term ‘stress’, the US military had become aware of the same syndrome during World War Two, in the common forms of psychological collapse experienced by soldiers who had spent too long in battle. The stressful demands placed on a human being are not merely physical, but social and psychological too. What went on between the demand and the response was open to a range of different scientific explanations beyond merely biological ones. The study of stress became an expressly interdisciplinary field.
As the study of how humans cope with physical and mental demands, it also lent itself perfectly to the study of work. By definition, stress is something we encounter without having chosen to, but cannot avoid. It often occurs when we are trapped in a certain situation, simply forced to react to it. The field of occupational health emerged during the 1960s to understand precisely how work impacts upon us, physically and mentally. Studying how different types of job demand produce different hormonal and emotional responses yielded a number of potentially transformative findings. It wasn’t simply that excessive demands were bad for people; insufficient workplace demands – or boredom – could also be unhealthy, as Selye had recognized. Our current concern with unemployment as a potential health risk is one manifestation of the latter anxiety.
Just as Mayo’s emphasis on dialogue created an opening for a more thoroughly egalitarian critique of business hierarchy, the study of stress in the workplace achieved something similar for a while. Work carried out by the psychologist Robert Kahn and his colleagues at the University of Michigan during the early 1960s highlighted the various ways in which power structures and work design impact upon the health of employees.33 Badly designed jobs and lack of proper recognition in the workplace were clear contributors to physical and mental ill-health. Lack of any influence over where and when one carries out a task is a stress factor, which takes its toll on both mind and body. A number of clear routes between the injustices of hierarchical business and the vulnerabilities of the human body were becoming apparent. One of the most important of these was the discovery that stress leads to the hormone cortisol being released into the bloodstream, hardening the arteries and increasing the risk of heart attack.34 Despite the high-profile obsession with executive burn-out, this form of stress is far more common for those lacking power or status at work.
By the 1980s, the non-specific syndrome that Selye had first identified in his lecture hall in 1925 had become one of the most pressing problems confronting managers in the Western world. Workers were no longer reporting straightforward physical fatigue of the sort that Frederick Taylor might have understood; nor were they simply unhappy in a way that Elton Mayo might have recognized. They were now exhibiting a generalized deflation of activity, a form of psychosomatic collapse that we have come to identify with the concept of stress. In the UK, stress overtook repetitive strain injury in 2012 as the leading cause of absence from work. This is not easily classified as either a physical illness or a mental illness. What prompts it may include work but may equally include other types of social, psychological or physical demands that the individual simply can’t cope with.
The science of stress was of the utmost importance for managers worrying about the depletion of their workforce. It became one of the main preoccupations of the human resources profession, which sought out rudimentary wisdom on a panoply of ‘bio-psycho-social’ complaints. The sheer breadth of contributory factors to stress – some tangible, others intangible – made it extremely difficult to achieve any control over it. This is in addition to the graver psychosomatic risks faced by those in precarious jobs, who move in and out of work, without even managers to support them from one month to the next. One conclusion to draw from this would be, as per the occupational health studies of the 1960s, that the fundamental politics of work had grown dysfunctional and needed a more wholesale transformation, and not simply piecemeal medical treatment. But would this be the lesson that was learnt?
Taylor’s revenge
When the young woman in the Hawthorne plant informed Elton Mayo in 1928 that she was hoping to visit Norway for a wedding, this would have represented an unusual level of intimacy, had Mayo been her boss. In the early twenty-first century, managers in large corporations pursue a very different form of intimacy with their employees.
Consider Unilever, the global manufacturer of food, beauty products and cleaning products. In 2001, its senior management demanded a programme to help them personally manage their own energy levels, as they feared the consequences of executive working lifestyles.35 Given the industry they were in, there was ample expertise to help them design this. The ‘Lamplighter’ health and well-being programme (named ‘Ignite U’ in Australia) was the result, tailor-made to help senior management keep up their performance levels and offset the risk of stress. The business benefits of Lamplighter quickly became clear, with evaluations suggesting that every £1 spent on the programme yielded £3.73 in return. It was quickly rolled out across dozens of Unilever offices around the world before being extended to cover the rest of the workforce.
Programmes such as Lamplighter are becoming more and more common, seeking to identify a wide range of health and well-being risks in the workforce, from the sporting activities of employees to their ‘mental resilience’. Lamplighter requires Unilever employees to be formally (albeit confidentially) assessed in terms of a range of ‘behaviours’, relating to nutrition, smoking and drinking, exercise and personal stress. The state-of-the-art workplace of today has taken on features of the doctor’s surgery, just as the doctor has been required to take on skills of the motivational manager. What are referred to as ‘Health 2.0’ technologies for the digital monitoring of well-being are often indistinguishable from productivity enhancements. The iPhone 6’s Health app, launched in September 2014, was celebrated as another example of Apple’s reimagining of our everyday lives, without much pause to think who it had really been designed for. Needless to say, employers, health insurers and wellness service providers are amongst the main enthusiasts for the phone’s constant measurement of bodily behaviour.
Many ‘best practice’ employers now offer free gym membership to their most valued staff, and even free counselling. Business services, such as Virgin Pulse (a telling name, seeing as pulse rate represents life in its most quantifiable form), offer an integrated suite of psychosomatic programmes aimed at optimizing employees’ physical energy, attention span and ‘true motivations’, through extensive digital surveillance and coaching. As the physical and the psychological character of work – and of illness – start to blend into each other, notions of ‘health’, ‘happiness’ and ‘productivity’ become ever harder to distinguish from each other. Employers end up treating all three things as a single entity, to be maximized via a range of stimuli and instruments. This is the monistic philosophy of the twenty-first century manager: each worker can become better, in body, mind and output.
The political hope that perhaps the human benefits of dialogue and workplace empowerment might be more thoroughly recognized turns into disappointment, as performance management and health care are fused into a science of well-being optimization. And yet there are radical political economists for whom the de-materialization of contemporary work represents an opportunity for a whole new industrial model.36 The shift towards a ‘knowledge-based’ economy, in which ideas and relationships are key sources of business value, could be the basis of entirely new workplace structures in which power is decentralized and decisions taken collaboratively. There are good reasons to suspect that such models might produce fewer psychosomatic stresses; in that sense, they may be more efficient than the status quo. If dialogue in the workplace is a necessary factor for productivity – as Mayo recognized – why not grant it some real influence over how decisions get made, right up to the highest level? Rather than ironic management speak, which twists words to manipulate emotions in the expectation that this will yield greater output, a more honest reflection on the problems of occupational ill-health would question the hoarding of status and reward by a small number of senior managers. Instead, traditional forms of management and hierarchy are rescued by the new ubiquity of digital surveillance, which allows informal behaviour and communication to be tracked, analysed and managed.
Rather than the rise of alternative corporate forms, we are now witnessing the discreet return of the ‘scientific management’ style of Frederick Winslow Taylor, only now with even greater scientific scrutiny of bodies, movement and performance. The front line in worker performance evaluation has shifted into bodily-monitoring devices, heart-rate monitoring, and sharing of real-time health data, for analysis of stress risks. Strange to say, the notion of what represents a ‘good’ worker has gone full circle since the 1870s, from the origins of ergonomic fatigue studies, through psychology, psychosomatic medicine and back to the body once more. Perhaps the managerial cult of optimization just needs something tangible to cling onto.