After the death of its leader John Smith on 12 May 1994, the Labour Party had the chance to change direction completely and make itself electable. The only way that this could possibly happen was for it to abandon its left wing, and to become a party of Social Democracy, with a plausible economic policy, largely indistinguishable from that of the Conservatives. Hence, New Labour was born. There was more than one architect for the idea. Director of the London School of Economics Anthony Giddens, the author of The Third Way, was perhaps the theorist. Peter Mandelson was the spin doctor. But the two heavyweight politicians who embraced the idea and made it their own were Tony Blair and Gordon Brown. One of them was bound to succeed John Smith and put the Third Way into practice.
Smith’s death had taken Brown by surprise, and he was knocked even further off course by the speed and ruthlessness with which the Blair camp, a word which seems rather too apt, was directed by Mandy. By the time Brown had assembled his supporters in the House of Commons, he found that the Blair bandwagon was unstoppable, and he felt ‘betrayed, devastated’.
He did not have the courage to run against Blair. Had he done so, there were those, chiefly his Scottish fellow countrymen, who believed he would have stood a good chance of winning. But it was not in Brown’s nature to take risks. ‘Prudence’ was his favoured fiscal and economic policy, and luckily for him it coincided with a period of stability in the world markets and economies which lasted for most of the time he was Tony Blair’s Chancellor of the Exchequer. At first, licking his wounds, Brown directed his hatred against the mincing and absurd figure of Mandy. After they came into power, however, with Tony as Prime Minister and Gordon as Chancellor, Brown was unable to conceal his hatred of the Blairs, and their mutual detestation made for one of the most peculiar spectacles which public life had known for decades. The public had grown used to stories of Margaret Thatcher’s Cabinet finding her difficult. But such legends paled beside the implacable, relentless feud which continued, in season and out of season, between the increasingly mad-seeming Blair and the not-obviously-much-saner Brown, with his dour manners and a tongue disconcertingly too large for his mouth.
The extent of the bad feeling between Brown and Blair astounded all who encountered the two men for the first time. A marked feature of it was the filthy language used by the Vicar of St Albion’s (‘I’m going to take no more shit from over the road’–i.e., from Brown). Brown believed, or let it be believed, that Blair had entered into a ‘gentleman’s agreement’ over their ‘dual premiership’ that he would stand down and allow his former friend, now his bitter enemy, to take over. Blair and the Blairites denied that any such ‘gentleman’s agreement’ had ever taken place–as how could it between these non-gentlemen? One shrewd political observer, Andrew Rawnsley,1 saw them as like the terrible twins Esau and Jacob, vying for power while even in their mother’s womb. When Blair was talking big to his supporters, he pretended that he was on the verge of sacking Brown. Given the disloyalty of Brown and his acolytes, it was undignified of Blair not to have sacked him. But he did not dare, partly because Brown’s power base in the Cabinet was so great, and partly because, for much of the decade of Blair’s spell in office, it was indeed a dual monarchy, with Brown balancing the books in his counting house and Blair strutting the world stage. Brown, equally cowardly, and just as painfully locked into an abusive relationship which caused both parties more pain than delight, was too weak to resign or to stand openly against his leader.
Blair behaved towards Brown as Churchill had behaved towards Eden, endlessly teasing him with the possibility of his taking over, but never quite finding the moment when he was prepared to relinquish power. Brown and his friends had to wait until Blair had become extremely unpopular with the electorate (after the Iraq War) before they managed to stage a coup and extract his promise to stand down. Blair waited until he had done over a decade at Number 10, and then retired from office in order to convert to Roman Catholicism and to pursue a bewildering range of activities. Gordon Brown was at last able to hold the reins of power.
In the first few weeks he was popular. There was a palpable feeling of relief, both in the country and in the party, that Blair’s hammy stage-acting had been replaced by the earnest seriousness of the Presbyterian dominie. But after only a few weeks, Brown came unstuck. First, at the Labour Party Conference, he made an ill-judged speech whose populism was at variance both with his serious character and with the facts of the case. (He promised, for example, ‘British jobs for British workers’–but how was this to be achieved in the current state of the European Labour Market?) His supporters, egged on by their cheerleader Ed Balls, now an education minister, urged him to capitalise upon his popularity and win a mandate from the electorate by going to the country. Fatally, Brown allowed it to be known that he was tempted by the idea and he took soundings, rather than quashing immediately the rumours that such an election might be called in the autumn of 2007. By the time he announced that there was to be no election, he had managed to look both opportunistic (considering such an election in the first instance) and cowardly (backing down when it looked probable that he stood a chance of losing).
Thereafter, Brown was revealed as a man, like the Prince of Wales, cursed with the one quality which makes public life unendurable: bad luck. For the first time since the 1860s, there was a run on a British bank: Northern Rock. Old Labour would have nationalised it on the spot. Anyone else of any sense would have allowed it to go bust. It had been borrowing and lending money it could not afford. But Brown’s Chancellor, his former Treasury sidekick Alistair Darling, who was obviously His Master’s Voice, dithered. They dithered with the taxpayers’ money, eventually shelling out more to protect the shareholders and potential new owners of the bank than they had spent in the previous year on the entire military budget. On top of a number of minor scandals, in which it became clear that his Cabinet colleagues had been vague about declaring donations when they had stood for the deputy leadership, the public decided that Gordon Brown was a less attractive proposition than he had seemed when he first came to office.
In the days of the Lilliputian battles between the Blairites and the Brownites, whose petty victories, on either side, were trumpeted in the columns of their fawning acolytes in the press, Brown was often described as a control freak. But economists, who are nearly always determinists of one sort or another, do not really believe in human beings, let alone Prime Ministers, controlling destiny. Andrew Glyn, for example, the Oxford Marxist, spoke in Capitalism Unleashed of the economic cycles of the previous half-century as if they were meteorological conditions over which human beings could exercise no restraint. In the 1950s, when the British had ‘never had it so good’, there was an unprecedented economic boom; there was low unemployment, inflation was easily containable and living standards expanded prodigiously. But this very prosperity, according to the Marxist analysis, led to a strengthening of the labour force and a high demand by larger numbers of people globally for finite energy sources. Therefore everything began to unravel. The price of oil went up, labour relations worsened as workers demanded to be compensated for inflation with yet more inflationary wages. Unemployment, at a level unseen since the 1930s, was the inevitable consequence. Then came the inevitable Reagan–Thatcher reaction. Interest rates were raised to punitive levels to curb inflation. Privatisation and deregulation freed the financial markets. Exchange rates floated. The economy revived itself. But no sooner had these inevitable decisions been taken than there was a cascade of surplus savings from the fast-growing economies of Asia. During this period, credit was seemingly unlimited. Unregulated by governments or by anyone else, the banks could lend money recklessly. With the advent of another recession, this was bound to be corrected by ruin.
All Gordon Brown’s ‘prudence’ and control came to nothing as the world cascaded into recession. If he had allowed Northern Rock to collapse, the matter would have been a sorrow for its investors, but, instead, Brown and Darling committed themselves to a rescue which would cost every household in Britain the equivalent of a mortgage of £2,000. This would not be forgotten. Though the determinists might be right about the inevitability of economic cycles, electorates look at politicians and assess their steadiness under fire. Gordon Brown failed this test.
A central contention of Edward Gibbon’s History of the Decline and Fall of the Roman Empire is that the essentially noble, civilised and pagan classical world debased itself by embracing the superstitions (as Gibbon saw them) of Christianity. When he reaches the life and career of Simeon Stylites (395–451), Gibbon abandons his normal irony and expresses his open contempt for the heroes of Christian monasticism. Contrast the writings, and lives, of the great Romans, with the ludicrous antics of the desert fathers, and in particular Simeon, who spent his life torturing himself on top of a pillar, and Gibbon’s urbane, rationalist point is made: ‘If it be possible to measure the interval, between the philosophic writings of Cicero and the sacred legend of Theodoret, between the character of Cato and that of Simeon, we may appreciate the memorable revolution which was accomplished in the Roman empire within a period of five hundred years.’2
Our span, in this book, has been shorter: a mere fifty-five years. It is tempting, however, to make similar comparisons as we appreciate the ‘memorable revolution’ which has taken place in Britain. We began the story with a Prime Minister in decay; but that Prime Minister was Winston Churchill. To compare even an ancient, sick, collapsed Winston Churchill with Gordon Brown would be a cruelty. But the comparisons would not get us anywhere. Go back a hundred years, however, to the time when Churchill was President of the Board of Trade in 1908, and the contrast will be sharper. Then we should find a political class composed of first-rate intellects; and in the fields of literature, music and the visual arts a similar level of excellence. Then you would be comparing the England of Lord Salisbury, Arthur Balfour and Lord Morley, with the England of Ed Balls and Jacqui Smith; the England of Edward Elgar with that of Harrison Birtwistle; the England of William Nicholson with the England of Gilbert and George; the England of Henry James, Joseph Conrad, Thomas Hardy and George Meredith, with that of Ian McEwan and Martin Amis. Then Great Britain was the greatest power in the world. Compare it with the Britain of 2008, and the language of decline and fall becomes inevitable. The two world wars ruined Britain in much more than a financial sense. Culturally speaking, as we can now see, Britain had declined beyond redemption before the period covered by this book. We could draw Gibbonian comparisons between 1952 and 2008. For example, we could compare Evelyn Waugh, still writing novels when this book begins, and whichever contemporary novelist you would like to choose. It would be possible to continue in this vein, and to make a rhetorical case for the passage of the last half-century of British life having been a decline and fall; possible, but unhelpful. 
The Britain which saw Elizabeth II’s Coronation and the Britain which will see her funeral are in reality two different, equally awful, places.
Coronation Britain was certainly not a very happy place, and it is doubtful whether even the most conservative inhabitants of Brown’s Britain would truly rejoice if transported back to 1952, to a country where divorce was punitively difficult, where homosexuals could be imprisoned, where olive oil was sold in tiny phials at the chemist, and where table wine turned your teeth black. In all material senses, the Britain of 2008 is a much more plentiful and a much more interesting place than the Britain of 1952. Nor has there been a ‘decline’ in the sense that Gibbon chronicled in the history of the Roman Empire. Gordon Brown might not be Winston Churchill, but the politicians of the 1950s were not noticeably more impressive individuals than those of the 2000s. As for the general amenities of life, decline is, once again, not what we have to describe. True, British towns became uglier, smellier and more overcrowded in the period covered by this book; but compare the following in 1952 and 2008–restaurant meals, in London and the provinces; provincial hotels; opportunities to hear operas and concerts generally; range of broadcasts; cost, and availability of telephones; price and range of raw food, and awareness of food and of dietary health; the standard of dentistry; the range and potentiality of pharmaceuticals and of surgical skills; the life expectancy of the old and of babies; the care for the disabled and the elderly, both residential and at home; the opportunities for higher education; the possibility of living, without the censure of the law or of your contemporaries, as a homosexual, a lesbian, a transsexual; the possibilities of cheap travel, anywhere in the world… When you consider all these things, it is impossible to think of the last half-century as a decline. But at some stage along the journey, Britain ceased to be a society.
Margaret Thatcher spoke in simplified terms to a woman’s illustrated weekly, but her contention that ‘there is no such thing as society’ became true in our times. There was a multiplicity of societies, all wondering how they could best live together without actual hostility breaking out between them.
One very obvious indication of how confused the governments of the twenty-first century felt about being British was to be found in their immigration policies. In 2005, the Home Secretary introduced a system whereby potential immigrants to Britain would be assessed upon their usefulness. ‘High-skill workers’–doctors and engineers, for example–could enter Britain without a job offer. Those at a slightly lower level of skill, for example teachers and nurses, would be allowed to come, but only for a limited period, and only if the labour market was short of their particular skills. Those who applied for permanent residency would be required to pass simple English language tests.
Far from being reassured by these measures, most members of the British public were horrified to have it so clearly spelt out that such simple entrance requirements had not been in place for the previous fifty years. While all this tough talk was emerging from the Home Office, the news had broken that the government actually had no idea how many illegal immigrants there were in the country, nor how many absconded asylum seekers, nor how many immigrant criminals had been released, or had escaped, from prison. With all its love of statistics and minding the business of the law-abiding and tax-paying population, the government could not keep tabs on migrants. And no wonder, since, quite apart from the numbers of asylum seekers and immigrants from the rest of the world, Britain had signed up to the enlargement of the European Union. When this measure was introduced in 2004, France, Germany, Italy, the Netherlands, Spain–indeed all the bigger and richer countries in Europe–saw that to allow unlimited movement of workers from Eastern Europe would create chaos in the labour market, not to mention putting intolerable burdens on services such as hospitals, public transport and housing. Sweden and Ireland had their own reasons for not fearing an overwhelming invasion of migrant workers, and they alone (apart from Britain) allowed open entry to migrant workers from the new member states. For Britain to do so was little short of insanity. ‘Far more workers came into the UK than was originally anticipated,’ admitted the New Labour guru Anthony Giddens3: around 420,000 from Eastern Europe, this on top of the numbers of asylum seekers and illegal immigrants in the same period. By 1 January 2007, the government had changed its mind about an ‘open-door’ policy, and decreed that Bulgaria and Romania, when they joined the Union, would not be granted the same privileges of unlimited access.
The EU expressed its ‘disappointment’ at the British decision, but made no suggestion as to where the extra buses, extra underground trains, extra flats, extra hospital beds and extra schools were to be found to accommodate all these extra people.
No doubt the Polish plumber and the Czech bricklayer were cheaper than their British equivalents. It was strange for a Labour Party–even a New Labour Party–to be so blatant about importing cheap labour on this scale, with the inevitable depression of wages and living standards for the majority. No doubt it was also embarrassing to see how much more hard working and how much more skilful were the workers trained under communist regimes than under fifty years of ‘consensus’ politics. Certainly, the arrival of the Poles, particularly in London, with their eager, intelligent faces, their willingness to work fifty-hour weeks mending the lavatories and building the kitchens of the British middle classes, made the indigenous poor seem all the more pathetic.
The underprivileged of Brown’s Britain were not poor as the inhabitants of Victorian slums had been poor. But because they ate the consoling junk food beloved by American proles, they came to resemble them, waddling from Iceland to Burger King or Dunkin’ Donuts in their huge blue jeans, pushing their obese tots in groaning strollers. The intelligent among these tots stood less chance than their British working-class equivalents in 1952 of rising through education and shaking off the constraints of their upbringing; this was partly because their upbringing and education (constant television, computer games, overcrowded inner city schools where few graduates were brave enough or unworldly enough to want to teach) were unlikely to train them in the gifts of concentration which would make such a life change possible.
In 1944, when drafting his Education Act, R. A. Butler recognised that there would be a strong case for abolishing private education. Had he done so, and had at least a proportion of the good teachers from the private sector remained in the system, and had that system resisted some of the educational theories which plagued teachers’ training establishments from the 1960s onwards (for example, the abolition of phonic techniques when teaching children to read), then it might have resulted in a more cohesive society. Every now and then, even late in our times, an old optimist would surface, wishing that the iniquity of division could be abolished. Alan Bennett, for example, when in his seventy-fourth year, called for private schools to be brought to an end. ‘It is the paying. It is the fact that you can buy advantages for your children over and above their abilities, which seems to me to be wrong. It’s a fissure that runs right through English society and you don’t get that in France. In France, state education is the best. It should be the same here. If the state schools were the best, if you had to compete to get into them and their education was better than what was on offer privately, then the whole nature of education would be transformed.’4 Such an optimistic viewpoint presupposed that the same educational values and ideas would have obtained in 2008 as were current when Bennett himself enjoyed the benefits of an old-fashioned grammar school education in the 1940s. When Anthony Crosland expressed the wish to abolish ‘every fucking grammar school in England’, however, he ushered in a mob of theorists who questioned the very standards of excellence which enabled schoolchildren, regardless of income bracket, to prosper.
Alan Bennett’s notion of clever children competing with one another to get into the ‘best’ schools would have been anathema to the egalitarians of the 1970s, when Iris Murdoch, for example, a lifelong leftist, abandoned her commitment to Labour after streaming was abolished in state schools and the theorists insisted upon mixed-ability classes for languages and mathematics. As she said at the time, ‘you don’t have mixed-ability football teams’. Had Anthony Crosland and Shirley Williams made the state schools better than the private sector (as in France), then there would have been no problem, since the great majority would then presumably have patronised the state schools. The fact that so many parents chose to subsidise state schools through tax, and then to spend a large amount of their taxed income on private education, was not a sign of their innate selfishness. It was an indictment of the gross incompetence of generations of politicians.
In any case, what might have been an economic or political possibility in 1944 was never going to be a possibility in the twenty-first century. As many as a quarter of London primary school children by this date were educated privately. One-tenth of the schools in England, Wales and Scotland were private (2,261 schools compared with a little over 20,000 primary and secondary schools in the maintained sector5). Even if the political will and courage had existed to abolish ‘every fucking’ private school, in the way that the grammar schools were similarly abolished in a previous generation, it is hard to see how the economy could pay for the thousands of teachers, and hundreds of thousands of pupils, who would need to be absorbed into the maintained system. Yet Alan Bennett and the good-hearted optimists were of course perfectly correct to say that the division between those educated privately and those not so lucky was ‘a fissure that runs right through English society’.
As Belloc wrote:
For the hoary social curse
Gets hoarier and hoarier
And it stinks a trifle worse
Than in the reign of Queen Victoria.
Optimists would dig up the drains and try to get to the source of the odour. Pessimists would think the stink a lesser evil than the chaos which digging would cause. ‘Leave ill alone’ had, in a quite different context, been the 3rd Marquess of Salisbury’s advice a century and more earlier. At the end of our times, the good intentions of the educational theorists of the 1960s and 1970s had ended in disaster, with a higher proportion than ever of privately educated children being admitted to the better universities, and landing the better jobs. The underclass or lumpenproletariat therefore grew in our time; and its clever members had even less chance of escaping it. Its benign members could still be pampered and condescended to, from cradle to grave, by the benefits system. But the crashing boredom of life for the lumpenproletariat meant that more and more of its members sought variety through narcotic abuse, alcohol and the diversions of criminal activity which could pay for the former indulgences. The world of 1952 thought that badly behaved young people could be knocked into shape–by compulsory military service, or by corporal punishment at the hands of teachers, parents or the police. Those options were no longer open to the (in some ways) kindlier world of 2008, even if it were supposed that such methods would be effective. Though politicians and sociologists tried to persuade anyone who would listen that there were optimistic solutions to the problem, the pessimists had long since begun to behave as if British cities of the twenty-first century were like American cities of a quarter of a century before.
Rather than attempting the radical, kindly solutions provided by such as the Prince’s Trust (through education, youth training or family counselling), many British people in 2008 took the ‘American’ solution–of better burglar and car alarms, more vigilance when walking in city streets after dark, and care not to stray into those shadowy areas where the wild things walked. Every now and then the news would tell of yet another knifing, shooting or strangling. Few really believed that the New Labour promise, to be ‘tough on crime, tough on the causes of crime’, was any more than an empty phrase.
The relationship between crime and narcotic abuse was evident, but the political class did not dare to draw the obvious conclusions. Prohibition of drugs in Britain had been no more effective than had been the prohibition of alcohol in the United States in the gangster era. In both cases, the criminalisation of the substance merely made the suppliers into barons, figures of power. Had any government had the courage to decriminalise drugs it would at least have been able to deprive the pushers, the pimps, the suppliers small and large, of their power and money. It would never do so, for fear of the popular press. When Richard Brunstrom, Chief Constable of North Wales, said that the recreational drug ecstasy was ‘far safer than aspirin’, he spoke no more than the simple, scientific and statistical truth.6 Yet every teenager who died as a consequence of taking an ecstasy tablet while out clubbing had died an avoidable death; and the popular press would not allow the politicians to forget it.
Regrettable as drug-related deaths were for the families concerned, it was the social menace posed by the drug users who did not die which was noticeable in early twenty-first-century Britain. The use of crack cocaine soared. From 1985 to 1987 cocaine-related hospital emergencies rose from 23,500 to 55,200.7 The need to satisfy the urge for crack was as intense as the effects of the drug itself: the resulting rise in crime was simply consequential, but, as the medical statistic just quoted shows, alongside the nuisance of a million stolen purses, smashed car windows and vandalised phone boxes, one has to take note of the time and money expended by doctors and nurses in Accident and Emergency units on these individuals.
Nor was cannabis, the recreational drug of preference for baby boomers and the middle classes, as safe as its adepts hoped. Fifteen percent of cannabis sold on British streets in 2002 was skunk, a super-strength strain which, medical opinion said, accounted for a quarter of all cases of schizophrenia in 2007. The 15 percent of 2002 had soared to 80 percent in 2008. There were an estimated five million cannabis smokers in Britain at this date.8 That is a lot of people risking, not the slow wits and silliness of the Bob Dylan dopehead generation, but the outright and seemingly incurable madness of the skunk-minded. In such circumstances, the government’s vigilant attacks on cigarette smokers looked to some citizens like fiddling while Rome burned.
Comparable, and related, to the question of crime, and of the social alienation which it both reflects and brings to pass, is the whole story of immigration to Britain. Those who governed Britain in the first two decades covered by this book tended to be old men. They had grown up when the British Empire, whatever virtues it had possessed, was ingrained with racialist ideology. The chief thought in their minds, once the immigrations from former Commonwealth countries began, was of racial contamination. The rival ideology was the optimistic idea of multi-culturalism, which by the end of the period we have been considering had been abandoned even by some of our social engineers. Trevor Phillips, for example, Chairman of the Commission for Racial Equality, announced in April 2004 that multi-culturalism was no longer ‘useful’ because it ‘encouraged separateness between communities’. On a day when British Muslims were holding one of their regular protests and burning the Union flag outside Regent’s Park Mosque in London, Phillips said, ‘What we should be talking about is how we reach an integrated society, one in which people are equal under the law, where there are some common values’. Against the ‘extremist’ ideas of the radical Islamists, Phillips proposed that there was an urgent need to ‘assert a core of Britishness’. Being British meant that everyone, including the Muslims, had to ‘work by the rules of British people–and that excludes terrorism’.9
The advantage of the ‘multi-cultural’ idea was that it enabled everyone to feel at home in their own language, religion, dress codes and eating habits without being imposed upon by the government. It was, after all, successive British governments who had allowed, or actively encouraged, immigration over the previous half-century; happy to do so when it provided cheap labour in an expanding economy; worried by the number of brown faces it might assemble in one place in such towns as Leicester or Bradford; and only noticing several generations later that behind the brown faces and the statistics were actually human beings with sets of beliefs, religious and political attitudes which might not sit easily with modern British secularism.
How do you impose ‘a core of Britishness’ upon people who are only British in the sense of possessing a passport, and who perhaps do not want someone else’s so-called values thrust upon them? What are these values, in any case? ‘Democracy and the rule of law’ is the answer which some would give. Yes, but…As we have seen several times in the previous pages, and in the two volumes which preceded this history–The Victorians and After the Victorians–Britain was only ever a partial democracy. Its parliamentary system and its civil service had both fully evolved long before the franchise was extended to all adults. The General Election is an opportunity for the British electorate to express preferences, and to change the make-up of Parliament, but it leaves the civil service untouched, the police and the judiciary untouched. Only established or ‘acceptable’ political viewpoints are offered at the ballot box, and those who wish, for example, to be governed by greens, by communists, by fascists, by Islamic fundamentalists or others–and this represents a substantial part of the electorate if added together–have no chance, no chance whatsoever, of seeing a candidate with their viewpoint elected to Parliament. Even the Liberal Democrats, who receive a high share of the votes, elect only a few dozen members to Westminster.
Britain remained a country governed by those who thought they knew best. In the nineteenth century this was a coalition of aristocrats and the professional classes, with a growing professional civil service. In the last fifty years, the aristocracy was slowly replaced by a different Establishment, of university graduates and career politicians, who were no less adamant that they did not need too much advice from the headstrong populace. The populace might think it wanted capital punishment, or an escape from the bureaucracy of Europe, but the governing classes always knew better.
Confronted with the spectre of Irish terrorism in the period 1970–95, this Establishment contorted itself into any number of positions until Tony Blair had the brilliant idea of giving the ‘extremists’ in Northern Ireland what they actually claimed to want: namely, power. He made them share it, a Dantean joke which worked. It would be less easy to do the same with the Islamic terrorists, since it was not in the power of the New Establishment to reinvent the Caliphate and bring to pass an Islamic world government–even if the New Establishment were to want such a thing. So it fell back on the rather lame belief that it must assert Britishness–at the period in history when it was hardest to define what so nebulous a concept might mean. A substantial number of Scots, at the identical period, wanted to break up the Union, and they look, as we bring our tale to a close, as if they will succeed. If they do so, where will it leave Wales? Britishness is apparently not so desirable a quality that all the British want to share in it.
Nor could the British fall back, as could the Poles, for example, at a time of national identity crisis, upon their ‘core values’ reflected in a shared religion. Although the world in general appeared, by the end of our period, to be becoming more religious, there was not much sign of this happening in Britain itself. Indeed, the Church of England by law established never looked more like breaking up altogether in a series of Lilliputian squabbles. To the outside world, which did not share its preoccupation with the legitimacy, or otherwise, of homosexual bishops or female priests, these appeared ever more arcane.
In October 1972, when he was the Conservative parliamentary candidate for Plymouth, Alan Clark went to the high school to give a talk to the students.
‘A girl, a slim dedicated Marxist, asked me why I was like I was, what motivated me. “Because I am British,” I said, “because I want to advance and protect the British people.” “So, what’s so special about the British?” she answered. “What makes them so different from everybody else?” Well I could have answered that what makes them so different from anyone else, is the capacity they seem to have for producing at every level of society, people like yourself who ask a question like that.’10
Everything in history evolves, and one of the things which has evolved and changed in our times in Britain has been the concept of the nation state. It is regarded in all the circles to which Alan Clark alludes with smiling contempt or with actual abhorrence. President Woodrow Wilson’s idea, promulgated with some happy, and some disastrous, results after the First World War, was that peoples should express their collective identity by the formation of nation states. In our times, we witnessed the urge for such identity, in the Baltic states, in Central Europe, in the Balkans, in the Middle East. The concept of a Jewish homeland turned into the right of the State of Israel to exist as an independent nation. Two little strips of land on either side of it, occupied by Arabs who were formerly citizens of Egypt and Jordan (and before that of the Ottoman Empire) were deemed to be a suitable starting point for a Palestinian state, presumably as a quid pro quo. Montenegro and Bosnia and Serbia emerged in our times, not as independent states, wishing to escape the violence of history, but as would-be nations, as did Latvia, Lithuania, Poland, Czechoslovakia… And naturally enough, Ireland and Scotland had the same need to express their collective identity as nations. The dangers of nationalism, which hardly needed to be explained to Europeans after the Second World War, did not deter the Russians and the French from becoming more rather than less nationalistic as the era wore on.
But Britain, and more especially England, was somehow deemed to be different. Although Gordon Brown spoke with eloquent lack of meaning or substance about ‘Britishness’ and ‘core vahlews’ and ‘British jobs for British workers’, voters could see through the rhetoric. He meant that if the Scottish nationalists were to succeed, his position, and that of all the other Scottish MPs at Westminster, would be untenable.
What he actually thought about Britain was revealed in a symbolic action performed before he became Prime Minister. As Chancellor of the Exchequer, he was responsible for coinage, and before he left that position he approved a new set of British coins. For the first time since the reign of Charles II, the figure of Britannia had been removed from them.
The helmeted figure of Britain’s tutelary deity, clutching her trident as she sat with a lion at her feet, had first appeared on a Charles II farthing. It was at the time when modern British history was beginning. The country had been through a devastating civil war. Religiously and politically divided, it now came together in an era of extraordinary creativity. The Book of Common Prayer of 1662, the origins of modern science in the Royal Society, the architecture of Wren and, a little later, the political rationalism of John Locke, began the story of modern Britain. Britannia was its emblem. During the eighteenth century, as the country expanded, both industrially and colonially, Britannia became the symbol of its emerging self-identity. The separate quarters of the kingdom–Ireland, Wales, Scotland and England–were all one. The skills and arts of all its peoples contributed both to its colossal commercial success in the nineteenth century, and to its expansion throughout the world. Thomson’s old ditty ‘Rule, Britannia!’, set to music by Thomas Arne, was an anti-slaving song, which expressed the innocence, the exuberance, the confidence in their own rightness of the British people in the ascendancy.
Gordon Brown sent Britannia packing. It was a small thing, but it signalled the end which any observer had seen coming for decades, the strange dissolution of Britain itself, not merely (as Scotland’s independence seemed ever more likely) as a political entity, but as an idea. Britannia no longer ruled the waves. But this was not simply because her shipbuilding industry had been dead for decades, and her navy in decline; it was not merely because her last major colony had been restored to the monstrous and ever-expanding power of China; it was not merely because there were a European Union and a United States which were both bigger and more powerful than she was. It was because Britannia had, at some point during our times, ceased to exist. She had become a missing person. The families of such persons rehearse, over and over, the last day they saw their beloved daughter, husband, mother; how they waited for the return which never came. Such was the state of anxiety and heartbreak of certain conservative romantics as our times came to an end. They strained their ears and eyes for signs of a return: none came. For such as these, almost the most intolerable thing about the loss was that there were others in the family who, bloated by illegal narcotics and American junk food, sat in front of the television in the downstairs room and did not notice that she had gone.