In 2005 a ‘European Agency for the Management of Operational Cooperation at the External Borders of the Member States of the European Union’ was brought into being. Behind this cumbersome, bureaucratic-sounding name lies a highly dynamic institution, which is supposed to make the EU’s external border controls more robust and effective. At present it has a staff roughly a hundred strong and is planning for a pool of 500 to 600 border police, drawn from the member states and – a legal novelty – empowered to take on functions outside the EU. The agency also has at its disposal more than twenty aircraft, thirty helicopters and over a hundred ships, as well as elaborate equipment such as night vision devices and state-of-the-art laptops.
Since the official name is obviously too unwieldy, a catchier, more evocative abbreviation has been agreed upon: Frontex (from the French frontières extérieures). It works closely with other agencies such as Europol, advises local border police at key areas of illegal migration, and assists in ‘joint removal operations of third-country nationals illegally present in the Member States’.1 Such persons are those who, having somehow reached an EU/Schengen country and been refused asylum, are liable to be shipped back or, in official parlance, ‘repatriated’ to their country of origin.2
The Schengen Agreement, which came into force on 26 March 1995, has concentrated the frontier problem in states on the outer edges of the EU, passport-free travel now being the rule within the Schengen area. The ‘country of origin regulation’, however, requires asylum-seekers to give proof of political persecution if they come from a country classified as ‘safe’; and the ‘third country regulation’ provides that individuals who have, for example, managed to reach southern Spain from Sierra Leone and then moved on to Germany may be summarily sent back to Spain and refused the right ever to apply for asylum in Germany. Not surprisingly this has stepped up the pressure on the EU’s Spanish and Portuguese as well as East European frontiers, while applications for asylum in Germany have fallen by a quarter since 1995. But it also raises the question whether, in view of the rising numbers of refugees (set to rise even more as a result of future climate change), it will be possible to secure the EU’s external frontiers as effectively as this is done at present.
Frontex, established by a decree of the European Council, chalked up some early successes – for example, a major cut in the number of refugee boats landing in the Canary Isles. The refugees who make the 1,200-kilometre journey, mostly by dinghy, across the open sea from West Africa to Gran Canaria or Tenerife come from countries where the existing conditions make life virtually impossible. Displaced by dam projects or civil war, they have drifted into megacities like Lagos, where 3 million people live in slums and there is neither running water nor a sewage system. There they pay smuggling gangs an exorbitant sum for a place on an overcrowded, barely seaworthy boat, with no return ticket and a high risk of not surviving the trip.3 Even so, some 30,000 made it alive to the Canaries in 2006, posing considerable problems for the authorities and the tourism industry there.
Other refugees try the Straits of Gibraltar, which, though only 13 kilometres across, have strong currents and dense traffic that make them no less hazardous. Many fail to reach the shores of Spain or Portugal, and those who do are usually shipped straight back; it is estimated that some 3,000 drowned in the attempt in 2006 alone. Frontex takes account of this, by defining one of its important tasks as ‘preventing illegal entry in life-threatening conditions’.4
Since the reasons why refugees want to reach Europe at any price remain the same, and since their routes become more dangerous as Frontex increases its efficiency, the ideal form of control is to project the EU’s borders outwards, preventing refugees from ever leaving the African continent. As long ago as October 2004 Otto Schily, then German interior minister, suggested building reception camps in Africa and verifying there whether an asylum application was valid or not.5 Most other EU interior ministers were not at all keen on the idea, and it also ran into protests from human rights organizations. The search for other solutions has been tough, as have negotiations with the African Union, so that there is still no alternative to the further tightening of border controls if such people are to be kept out of Europe. The situation in the Spanish exclaves of Ceuta and Melilla perfectly symbolizes the problem: border installations are continually being reinforced there, while refugees take ever more desperate measures to climb the fences; a mass storming in September 2005 involved approximately 800 people.
In the medium term, innovative technologies have given affected countries some relief: for example, a $2 billion system on the US border with Mexico will make it possible to locate intruders by GPS and to livestream the information to the nearest police patrol, dramatically reducing the number of illegal entrants. In 2006 no fewer than 1.1 million people were arrested in this frontier zone. In September 2006, the House of Representatives approved a plan to build a 1,125-kilometre high-tech fence to bolster security further. It is true that the total length of the frontier is 3,360 kilometres, but it is assumed that such measures will deter many potential violators, since the remaining areas consist of barely negotiable desert or mountain; the shortest distance across on foot is 80 kilometres. Between 1998 and 2004, a total of 1,954 people died along the US–Mexican border.
America and Europe will have to do more in future to protect themselves from the inrush of the millions of refugees who are expected to follow climate change. Hunger, water problems, wars and desertification will exert incalculable pressure on the islands of West European and North American prosperity. The German government’s Scientific Advisory Committee on Global Climate Change (WBGU) has pointed out that ‘1.1 billion people currently lack secure access to drinking water in sufficient quantity and quality’. This situation ‘may grow worse in some parts of the world, because climate change may lead to great variations in precipitation and water availability.’6
In addition, some 850 million people around the world are under-nourished – a figure which experts think will grow considerably in the wake of climate-induced shrinkage of farmland. The ensuing distribution conflicts point to a greater risk of violent escalation, since further population movements will increase the number of so-called migration ‘hot spots’. In this light, the WBGU urges, the promotion of development should be understood as a form of ‘preventive security’.
Such trends give a foretaste of what will happen when climate change boosts the flow of refugees. Space and resource conflicts due to global warming will fundamentally alter the shape of Western societies in the next few decades; Frontex is a mere harbinger of things to come. Climate change is therefore not only an extremely urgent issue for environmental policy; it will also be the greatest social challenge of the modern age, threatening the very existence of millions of people and forcing them into mass migration. The question of how to cope with such flows will become inescapable as refugees of whatever provenance seek to enhance their survival chances by moving to better-off countries.
Over the past forty years, the desert in northern Sudan has moved 100 kilometres towards the once fertile south. The causes are, on the one hand, steadily decreasing rainfall and, on the other, the overgrazing of grassland, deforestation and ensuing soil erosion that makes the land infertile. Forty per cent of Sudan’s forest has been lost since the country became independent, and at present a further 1.3 per cent is vanishing each year. For many regions, the United Nations Environmental Programme foresees total deforestation within the next ten years.
Climate models for Sudan point to a temperature rise of 0.5 degrees Celsius by 2030 and 1.5 degrees by 2060, while at the same time rainfall will decrease by an annual average of a further 5 per cent. This would mean a decline of 70 per cent in the grain harvest. Some 30 million people live in northern Sudan, and to appreciate what these figures mean we need to bear in mind that the country is already one of the poorest in the world; it also faces major ecological dangers, and a civil war has been simmering for the past half-century. There are 5 million so-called Internally Displaced Persons (IDPs), who have been systematically driven out of their native villages. Hostile militias not only kill people but also burn villages and forest, in order to prevent the return of those they ‘displace’.
Most IDPs live in camps with virtually no infrastructure: no electricity, no sewer system, no running water, no medical care. The food supply is largely provided by international aid agencies. In order to cook, people there have already cut down all the forest for as much as 10 kilometres around. The bare land is dangerous: many women are raped and killed on their way to fetch wood. They are not robbed, because they have nothing anyone could take.
The Darfur region in the west presents a similar picture, perhaps even worse since fighting has spilled over the border from Chad and the Central African Republic. There are another 2 million IDPs in Darfur, most of whom live in rough-and-ready camps on the edge of large settlements and towns. In some areas the population has swollen by as much as 200 per cent since the outbreak of open war. The United States and the EU have been unable to agree whether it should be described as genocide, but somewhere between 200,000 and 500,000 people have been killed so far.
Sudan is the first case of a war-torn country where climate change is unquestionably one cause of violence and civil war. It used to be assumed that the violent effects of climate change were indirect, but where survival is at stake even small shifts can acquire explosive force. Then it is a question of struggle for existence. In a country where 70 per cent of the population lives on and from the land, there is a real problem if pasture and arable land begin to disappear. Nomadic herdsmen need pasturage for their animals, just as small farmers need land to grow cereals and fruit for themselves and their families. When the desert expands, livestock breeders use the land of farmers, or vice versa. There is a critical threshold below which survival interests can be asserted only by force.
From 1967 to 1973, and again from 1980 to 2000, Sudan suffered a series of catastrophic droughts, which were one reason for the major population movement and thousands of deaths from starvation. Of course, apart from ecological factors, there are many other causes of conflict – so many, in fact, that an attempt to present a historical overview leaves one feeling helplessly confused.7 Varying in intensity and geographical location, war has marked the country’s life throughout the half-century since 1955. Only between 1972 and 1983 was there a fragile state of peace. An agreement was finally signed in 2005 that ended hostilities in the south, but since 2003 war has been raging in Darfur, in the west of the country. Everyone agrees that the war situation is disastrous – though little or nothing is said about the shortage of drinking water, the catastrophic floods, wastewater contamination and rubbish mountains, or the environmental destruction caused by expansion of the oil industry. There is a direct link between climate change and war. To look at Sudan is to look into the future.
In Western countries, the public agitation that followed the three reports of the Intergovernmental Panel on Climate Change (IPCC) in early 2007 has since subsided. Yet, if anything, the global scenarios have become grimmer. We now know that some parts of the world will be winners as a result of climate change, since they will become more favourable for agriculture and more attractive as holiday destinations. Hotel managers on Germany’s North Sea coast are happy enough, and the potential for wine-growing is expanding ever northward. The Stern Report, which considered the economic implications of climate change, seems to have caused a moment of horror, only to be rapidly followed by contemplation of the new economic prospects for the technologically advanced countries.8 Lord Stern, the World Bank’s former chief economist, calculated that the costs of unrestricted global warming would amount to between 5 and 20 per cent of world income (probably towards the upper end of this range), whereas the stabilization of CO2 emissions until the year 2050 would cost no more than 1 per cent of GNP, well within the capacity of normal economic development to absorb it.
Naturally there are variations between branches of the economy: producers of renewable energy would benefit, while ski tourism would lose out. But all in all an immediate change in climate policy is thought to offer an economic opportunity for the West. Improved energy production, more efficient appliances of every kind, hybrid vehicles, biofuels, solar panels and much else besides promise a rosy future. There is even talk of a ‘third technological revolution’, although this overlooks the fact that the first and the second are the causes of today’s problems.
Citizens show an environmental awareness when they use aircraft with a bad rather than a good conscience. But thinking about climate change can lead to unexpected reactions. Car drivers might go for a more powerful model than they originally intended, for the simple reason that the time for twelve-cylinder SUVs with 500 HP might soon be over.9 So-called climate and sustainability funds advertise themselves with the argument that climate-related lines of business fare better in the long run than the equity market as a whole. Nor are the benefits only financial; it gives investors a good conscience to feel that they are doing something ecologically useful.10
What do such examples show? They show that people adapt to new environmental conditions – and that the adaptation may be rooted not in a general change of behaviour but in a modified perception of the problems. A study was published in 2005 on how fishermen relate to the continual decline in fish stocks in the Gulf of California. Despite sharp falls in the fish population and massive overfishing, the younger men tended to be less concerned than their older colleagues, because they no longer knew how many kinds of fish used to be caught in abundance off the coast.11
One can see the coming climate problem as an opportunity here and now, as a vague distant possibility or as a matter of no consequence, positioning oneself accordingly in relation to the diffuse threat. As in the case of the Gulf of California fishermen, people’s perceptions change within the changing present of which they are part, and when, nonetheless, dissonances arise there are many and various ways to overcome them. For that it may well be enough to have an awareness of the problem, which creates the sense that one is not indifferent, heedless or powerless in relation to it. One then changes one’s attitude to the problem, not to its root cause.
We also need to realize that attitude and behaviour are two different things, linked to each other only loosely, if at all. One can have attitudes independently of the situation, beyond any reality checks or decision conditions, but actions generally take place under pressure and are determined by the requirements of the situation. This is why people often act in ways that contradict their attitudes – although, interestingly, they rarely have major difficulty in integrating such contradictions. One then compares one’s own behaviour with the still worse behaviour of others, finding it absurdly trivial beside the scale of the problem or deciding to act differently in the future. All this serves to reduce the dissonance between what is morally recommended and what one actually does.12
Such dissonance reduction is not unimportant: it may be effective even in extreme situations – for example, when people are asked to kill other human beings and find it difficult to reconcile their task with their moral self-image. In a study of mass murderers, I tried to show how the men in question manage to bring murder and morality into harmony with each other.13 They do this by operating within a mental frame of reference that allows no doubts to form about the necessity and rightness of their actions.
In Nazi Germany such men acted in groups, far from their usual social contexts, developing and ratifying common norms that were not subject to external criticism. They acted in what we might call ‘total situations’,14 without the heterogeneity of everyday life in which changing roles, social contacts and demands exert a corrective or conflictual influence on one another. Even killing was regarded as a necessary task, but it caused them considerable problems, since the murder of defenceless men, and especially women, was totally at variance with their customary self-image. However, precisely because they could think of themselves as men with a task they were expected to fulfil, they were able to reconcile their grisly work with their moral image of themselves as ‘good guys’.15 For this reason, scarcely any of them developed massive feelings of guilt in later years, and most were able to integrate successfully and inconspicuously into postwar German society.
The most striking and depressing feature of the testimony given by SS mass murderers is that there is never any personal acceptance of guilt, only pointed remarks to the effect that they were put in the position of having to do gruesome things against their will and feelings – and that they themselves suffered from this. Here we find a hint of Himmler’s ethic of ‘decency’,16 which in its time not only directed action but enabled the perpetrators to see themselves as human beings capable of suffering from the unpleasant aspects of their work. After the war, this self-perception underpinned the biographical seamlessness that so strikes the reader of their statements.
Such examples of extreme violence show that, for people in real situations, what is in principle decisive are not the objective circumstances but their perceptions of reality and how they interpret them. Only interpretation leads to a final conclusion, and then in turn to action. Hence an action that appears from outside to be totally irrational, counterproductive or purposeless may be highly meaningful for those who perform it, even when they are damaged as a result. Mohammed Atta, for example, saw a point in flying an aircraft into the Twin Towers, and the Red Army Faction terrorist Holger Meins in starving himself to death in a prison hunger strike. Overly rational images of human beings, such as those which underlie many action theories, allow no space for such forms of particular rationality. Only when we investigate how individuals perceive reality can we understand why they draw conclusions that appear from the outside as completely bizarre.
This may also give us more insight into the peculiar fact, on the one hand, that there is little scientific doubt about the danger of climate-induced breakdown facing many societies in the years or decades to come, and, on the other, that no one really believes it.17 There are a number of weighty reasons – apart from the remarkable capacity of human beings not to be troubled by contradictions in their behaviour – for this curious form of ‘apocalypse blindness’ (Günther Anders). The most important is the complexity of modern action chains and the incalculability of their consequences. Zygmunt Bauman calls this phenomenon ‘adiaphorization’: the disappearance of responsibility as a result of the division of labour in human action.18
One prerequisite of responsibility is that the parameters for methodical action are known. In modern, functionally differentiated societies, with their long action chains and complex webs of interdependence, it is in principle difficult for individuals to relate the eventual consequences of their actions to the controlled acts of volition for which they can take practical responsibility. For this reason institutions such as courts, mental hospitals or advisory centres have arisen to facilitate and regulate the production of such a relationship – which has a dynamic of its own, since here too a division of labour may mean that, as Heinrich Popitz put it, lack of staff competence ‘fatally [compounds] lack of involvement on the part of those whose cases are being dealt with. The two together lead to the smooth excesses of indolence that we know.’19
The problem of fading responsibility thus appears hand in hand with processes of social modernization, as a kind of price for the development or rebuilding of the institutions in question; responsibility is converted into competence, and hence automatically also into incompetence. Perhaps more serious, however, is the fact that people can take responsibility only in so far as the temporal relationship between an action and its consequences allows for them to be held responsible. If the cause-and-effect relationship does not stretch beyond the lifetime of the actors involved, then one finds a certain kind of limited allocation of responsibility, such as the verdict of the International Court of Justice that, although Serbia did not commit genocide against the Bosnian Muslims, it failed to intervene to prevent it. Other examples would be damages for product liability, as well as certain areas of criminal law, insurance law, and so on. In each case, the question is how far someone was responsible for the consequences of an action and how far he could have anticipated them.
But what if the person who brought about the consequences of an action cannot be held responsible because he or she is no longer alive? In company law this problem is regulated by the recognition of a legal successor,20 but the same does not apply when private individuals are involved. Moreover, things become considerably more complicated in the case of climate change, since the causes of the problems looming today go back at least half a century and could not have been foreseen in the state of scientific research at the time, and since strategies in the present to deal with consequences that cannot yet be anticipated may themselves have highly uncertain consequences that stretch into a distant future. The relationship between action and consequence is here trans-generational and can only be foreseen through the mediation of science. Its imperceptibility decreases the motivation for action and does not make it easier to allocate the blame for today’s problems.
For it would logically imply holding a forty-year-old person responsible in 2012 for a problem whose causes predate his birth and whose solution can only come after his death – a person, therefore, who can have no direct influence on either the causes or the solution. Although he should be required to deal responsibly with the problem, it must be asked whether he is able to do this in any traditional sense and, if so, what form such responsibility might take in practice.
This question has an important bearing on democracy. What does the blurring of cause–effect chains imply for policy decisions and the general development of political awareness? How does the inbuilt lack of responsibility influence perceptions of the social consequences of, and possible solutions to, climate change? Which solutions that are now unthinkable to us will we consider possible in a few years’ time?
In the first third of the eighteenth century, when no one could imagine that in a couple of centuries the ‘modern age’ would convert its ideals of progress, rationality and efficiency into industrial mass murder, Jonathan Swift put forward a proposal for the eradication of poverty in Ireland. As things stood, the statistics showed a constant increase in the numbers of the poor, as well as a disproportionately low return for the economy from the sums invested in each child. Neither the children of the poor nor their parents would any longer have to face a grim existence of hunger, theft and begging or become a burden to others; ‘instead of being a charge upon their parents, or the parish, or wanting food and raiment for the rest of their lives, they shall, on the contrary, contribute to the feeding, and partly to the cloathing of many thousands.’ His solution?
I do therefore humbly offer it to publick consideration, that of the hundred and twenty thousand children, already computed, twenty thousand may be reserved for breed, whereof only one fourth part to be males. […] That the remaining hundred thousand may, at a year old, be offered in sale to the persons of quality and fortune, through the kingdom, always advising the mother to let them suck plentifully in the last month, so as to render them plump, and fat for a good table. A child will make two dishes at an entertainment for friends, and when the family dines alone, the fore or hind quarter will make a reasonable dish, and seasoned with a little pepper or salt, will be very good boiled on the fourth day, especially in winter.
Swift then lists a whole series of positive effects that would ensue if children were treated as raw material for trade, gastronomy or the tanning industry; there would even be moral benefits, since abortion and infanticide would decline. In conclusion, he insists that he has no other motive in putting this forward than ‘the publick good of my country, by advancing our trade, providing for infants, relieving the poor, and giving some pleasure to the rich’.21
This ‘modest proposal’ is not Swift’s best-known satire, and its eerie quality comes from the way in which it logically develops a solution unthinkable in the moral climate of the West at the time. Given the twentieth-century rationalizations of genocide, however, complete with statistical material and moral arguments, we might think that Swift looks ahead to a time when all morality is no more than a residual category, which serves at best to reassure people before and after they act but sets no limits to their inhumanity.
The modern age has already witnessed many radical solutions to perceived social problems; how far these can go is shown by the ‘final solution of the Jewish question’, which involved straightforward annihilation of the Jews. Although we know from Turkey, Germany, Cambodia, China, Yugoslavia, Rwanda and Darfur, and from many instances of ethnic cleansing,22 that radical solutions are always an option even for democratic societies, there is a tendency to interpret mass homicidal processes as a departure from the ‘norm’, as ‘special cases’.
The few attempts to reverse the perspective, by asking what such catastrophes actually mean for the theory of society, have remained marginal and lacking in influence, whether philosophical (Günther Anders or Hannah Arendt, for example) or sociological (Norbert Elias or Zygmunt Bauman) in their approach. It is true that the sociology of catastrophe has found its way into homeland security thinking, but it has little place in social theory and is rarely met in either historical or political theory.
The social catastrophes of the twentieth century have clearly shown, however, that ethnic cleansing and genocide are not a deviation from the path of modernity but possibilities that first arose with the development of modern society. From this point of view, such cataclysms as the Holocaust should be understood not as a ‘breakdown of civilization’ (Dan Diner) or a ‘relapse into barbarism’ (Horkheimer and Adorno) but as a result of modern attempts to produce order and to solve perceived social problems. As Michael Mann shows in an extensive study, ethnic cleansing and genocide are closely linked with modernization processes, even though one would not think so at all from their displays of seemingly archaic violence. The same might be said of Islamic terrorism, which is a reaction to modernization and therefore closely, if negatively, bound up with it.
In Modernity and the Holocaust, Zygmunt Bauman explained why the Holocaust has never become a systematic object for social science: first, its perception as an event in Jewish history has defined it as a pathological, rather than normal, problem of modernity;23 and, second, it has been attributed to an unfortunate set of circumstances, which, though not so explosive in separation and usually tamed within the social order, came together disastrously in interwar Germany. If, instead of reassuring themselves in this way, sociologists had methodologically studied the phenomenon, they would have discovered industrial mass extermination as a ‘test case’ for the latent potential of modernity, which provided new insight into its character and motive mechanisms. Bauman notes the paradox that ‘the Holocaust has more to say about the state of sociology than sociology in its present shape is able to add to our knowledge of the Holocaust.’24 He therefore argues that the Holocaust should be seen as something like a ‘sociological laboratory’, in which ‘attributes of our society’ are revealed which ‘are not empirically accessible in “non-laboratory” conditions’.25
Hannah Arendt impressively incorporated modern institutions such as the concentration camp into the theory of society.26 In her account, the camps show that totalitarian societies and the dynamics of force create new realities, in which the actors in question can integrate particular rationalities that otherwise appear meaningless or deranged into comprehensive semantic systems. The standard instruments of social science, geared to rational models of behaviour, are not calibrated for the explanation of such systems.
Faced with such problems, historians read meaning back into events that may not have been there for people living at the time. One reason for this is that social history takes its bearings from the concept of understanding used in the human sciences, which involves ‘empathetic observation of an earlier state of culture’ and has ‘its roots in an idealistic, culturally optimistic, conception of history’.27 This kind of understanding proves ineffectual in relation to the criminal activities of modern totalitarian regimes, because the reality it has to deal with there is not understandable in any conventional sense.
Nazi extermination policy took over from colonial warfare a variant of killing that did not simply eliminate individuals regarded as superfluous or damaging, but extracted maximum utility in the form of ‘extermination through labour’. Thus, in the construction of giant underground facilities for the production of V-2 rockets or Me-262 jet fighters, prisoners were worked so hard that they could expect to live no more than a few months. Labour could be used simultaneously both for exploitation and as a means of killing because there would always be a fresh batch of people who could be worked to death.
This naturally required planning and execution, so that killing in turn became work. ‘Extermination through labour’ had to be organized logistically and technically; there had to be a camp, with prisoners’ barracks, sanitary facilities, staff accommodation, means of transport, electricity, water, railway tracks and wagons, and so on. For engineers and architects involved in this infrastructural development, extermination took on the form of a complex, business-like activity, with all the professional skills and efficiency concerns found in other contexts. This was also apparent in the organization of mass murder, such as that which took place in conquered Russian territories after 1941; here the normalization of killing meant that it was seen as a job like any other, and the need for professional problem-solving became part of an overall context of systematic tasks. The process was based on a division of labour: no one actually had to feel like a murderer, even though the killings were direct acts, not involving remote-controlled techniques such as gas chambers.
In the Nazi war of extermination, killing appeared rational to the perpetrators precisely because they saw it as ‘dirty work’, which could even be distressing to execute. The pressures that these ‘necessary’ duties placed on those who performed them were, as we have seen, a constant theme in Himmler’s speeches as well as in their own conversation. Their very distress spared them a murderous self-image, both at the time of the killings and after the war was over. Instead, they were able to embed their acts in a meaningful frame of reference: ‘I kill in the service of a higher cause’; ‘I kill for future generations’; ‘I kill differently from other people’; ‘I get no joy from this work’. It is a psychological mode that makes people capable of doing unimaginable things – of doing just about anything, in fact. Human action, unlike that of creatures incapable of self-consciousness, is not subject to any instinctual or constitutional limitations.
Human beings exist within a social universe, and we should therefore consider anything to be possible for them. There are no natural or other limits on their behaviour, even when – as in the case of suicide bombers – it puts an end to their own life. We should therefore treat it as sociological folklore when impressive-sounding anthropological claims are made that people develop hunting instincts, flock together and experience a lust for blood. In reality, violence has historically and socially specific forms, and it takes place in equally specific contexts of meaning.28
In the Nazi period, the higher meaning of killing was to help the racially pure society to achieve world domination. Rapid advances in technique led to the detachment and displacement of violence, replacing mass shooting with special extermination camps; the perpetrators no longer killed with their own hands, but transferred the task of killing to technology and the task of processing the dead to so-called functional prisoners. Zyklon B, in specially designed gas chambers, made it possible for the killers to exterminate without using violence directly.
Commemorations of the Holocaust are always associated with the idea that we can learn from history, that historians can provide the knowledge required to ensure that what happened then ‘never happens again’. But why should it ‘never happen again’, when the evidence shows that human beings – even those of unquestionable intelligence and with a liberal education – are able to find meaning in the most anti-humanist theories, definitions, conclusions and actions, and to integrate these into their familiar conceptions of the world?
When we look at the wide historical panorama of violence and of people willing to kill, should we not assume that the Holocaust made it more, not less, likely that such things will happen again? In 1994 the majority in overpopulated Rwanda thought it made sense to kill 800,000 Tutsis in the space of three weeks. It is a modernist superstition that allows us to keep shrinking from the idea that, when people see others as a problem, they also think that killing them is a possible solution. This often has less to do with aggressiveness than with purposive thinking. According to Hans Albert, the production of weapons has ‘in many cases been more useful than the production of tools’ for problem-solving.29 So, where does that leave us with ‘learning from history’?
I also want to mention a very difficult subject before you here, completely openly. It should be discussed amongst us, and yet, nevertheless, we will never speak about it in public. […] I am talking about the ‘Jewish evacuation’: the extermination of the Jewish people. It is one of those things that is easily said. ‘The Jewish people is being exterminated’, every Party member will tell you, ‘perfectly clear, it’s part of our plans, we’re eliminating the Jews, exterminating them, ha!, a small matter.’ And then along they all come, all the 80 million upright Germans, and each one has his decent Jew. They say: all the others are swine, but here is a first-class Jew. And none of them has seen it, has endured it. Most of you will know what it means when 100 bodies lie together, when there are 500, or when there are 1,000. And to have seen this through, and – with the exception of human weaknesses – to have remained decent, has made us hard and is a page of glory never mentioned and never to be mentioned. Because we know how difficult things would be, if today in every city during the bomb attacks, the burdens of war and the privations, we still had Jews as secret saboteurs, agitators and instigators. […] We have the moral right, we had the duty to our people to do it, to kill this people who wanted to kill us. […] But altogether we can say: We have carried out this most difficult task for the love of our people. And we have taken on no defect within us, in our soul, or in our character. (www.holocaust-history.org/himmler-poznan/index.shtml)