
12

A Young Country 1990–2000

MAJOR TO MINOR

The fall of Thatcher marked an epoch in British politics. It was dramatic enough that a prime minister was forced from office in mid-term. There were only three precedents in the twentieth century: Asquith and Neville Chamberlain, who had both been removed during wartime crises of national survival for which large numbers of their own backbenchers judged them unfitted; and Lloyd George, whose Coalition Government fell apart underneath him. Never before in peacetime had a party with a secure parliamentary majority decided to withdraw its support from its own leader.

The Conservative Party plainly did so because it believed that it would lose the next election under Thatcher but might win under a new leader. There was evidence for both beliefs. The frightening opinion polls of the summer of 1990, showing Labour at least 15 per cent ahead, were transformed into a small Government lead as soon as John Major took over; and, though this effect proved temporary, the Conservatives thereafter fluctuated at around 40 per cent – not far short of Thatcher’s levels when she had three times won parliamentary majorities. This should have been seen as a significant reminder of continued Conservative vitality; instead, the fact that Labour had eventually pulled back to a comparable level of support – and was usually reported as enjoying a small lead – made more impression at the time.

It was thus essential that Major appear to be not-Thatcher. On first entering parliament, as recently as 1979, he had been identified not as a hard-core Thatcherite but as an efficient and emollient fixer, well adapted to his work in the whips’ office. Having become Chief Secretary to the Treasury, with the job of holding down government spending, he possessed the ability to deliver the necessary cuts, without unnecessary fuss, and this was perceived by a prime minister who did not know him well as a sign of his doctrinal zeal. Hence her readiness to promote him: first to the Foreign Office and then (all too quickly) back to the Treasury as Chancellor. She might, perhaps, have reflected that it was, above all, Major’s support for British entry to the ERM which forced her to accept this unpalatable measure in October 1990; but by then she was running out of political options. In the following month, defeated herself, she saw that the only way to keep out Heseltine was by backing Major, who was thus elected with Thatcherite votes.

‘I am not running as son of Margaret Thatcher,’ Major had declared. This was, however, the only role in which he was acceptable to his old patron, whose emotional hold over the party which she had led for fifteen years made her a continuing presence, ignored or slighted at peril. Thatcher remained an MP until the end of the parliament, before entering the House of Lords; but it was her extensive speaking commitments abroad, especially in the USA, which encouraged her to exploit her residual fame in speaking out, ever more vitriolically, against pro-European initiatives (including some which she had previously supported in government). These fitful interventions, embarrassing in more senses than one, were one reason why Major’s efforts to stake out his own ground were continually thwarted. Another was his inability to give substance to the aspirations for a ‘classless society’ which he saw as exemplified by his own remarkable ascent, with little formal education, from humble origins in the shabby south London terraces of Coldharbour Lane, so evocatively Dickensian in its cadence.

A clear signal that this was a new government, with new priorities, was that Michael Heseltine, unapologetic and unsinkable, was restored to high office, initially as Secretary of State for the Environment. Here he became responsible for the poll tax – responsible, that is, for getting rid of it as soon as possible, as everyone now agreed. In March 1991 Heseltine instead introduced ‘council tax’, replacing the tax on individuals with a local property tax, much like the old ‘rates’, though now levied on the basis of valuation of properties within broad bands. Anomalies remained, of course; and Heseltine’s improvised measure brought no immediate relief to the Conservatives, who suffered another set of appalling losses in the 1991 local elections; but the name ‘poll tax’ was banished, and with it the most powerful symbol blighting the Major Government’s chances.

One effect of the three-year crisis over local government finance had been to revive the three-party system. The Liberals had, for a generation, underpinned their periodic revivals in parliamentary elections by more solid, if less spectacular, advances in local elections; and the advent of the SDP had not really changed this pattern, simply raised the stakes. But the open quarrel within the Alliance after the 1987 general election threatened to squander all these gains. The bulk of the SDP had agreed to merge with the Liberals in a new party, called the Social and Liberal Democrats in 1987 (peremptorily abbreviated to the Democrats in 1988 and, in deference to historic Liberal pride, finally settled as the Liberal Democrats in 1989). The trouble was that, until the rump of Owenites who clung to the name SDP ceased operations (effectively in May 1989), this attempt at consolidation had mainly produced fragmentation. Indeed, in the European elections of 1989 the Liberal Democrats were pushed into fourth place nationally behind the Greens, who exploited their opportunity to mobilize environmental concerns and thus record a flash-in-the-pan 15 per cent of the vote. The demise of the Owenite SDP and the subsequent decline of the Greens, plus the stimulus of the poll-tax crisis, served to resuscitate the Liberal Democrats, led since July 1988 by the former marine Paddy Ashdown. From a nadir of little more than 8 per cent in the opinion polls in the last quarter of 1989 – virtually back to the levels from which Grimond had rescued the Liberal Party in the 1950s – the Liberal Democrats recovered with surprising resilience, and by 1991–2 were often touching 20 per cent support, their best since the previous election.

The modest and engaging style of the new prime minister was very different from that of Thatcher; but were his policies simply Thatcherism with a human face? There were inevitable continuities, notably in foreign policy, where the inexorable decline of British power in the world had been a constraint even on an Iron Lady who, despite victory in the Falklands, was unable to defy the logic of withdrawal from Hong Kong. Though it had been Thatcher, faced with the impending expiration of Britain’s treaty rights, who approved the Sino-British joint declaration of 1984, it was Major who had to implement it. He chose a close political ally, Chris Patten, to serve as the last governor of Hong Kong (1992–7), with the task of paving the way to an agreed handover to China. Patten’s efforts to strengthen representative institutions in the colony tallied with his own liberal instincts, but such democratic rhetoric would have sounded more convincing had it come earlier in the century of British rule. When the Union flag was lowered on 1 July 1997, Britain’s dominion over palm and pine was reduced to a scattering of remote islands and rocky outposts.

With the sun finally setting upon the British Empire, European policy was increasingly seen as the test case for Major. Here, in his early months in office, he had taken a more positive stance by going to Bonn and claiming that he wanted Britain to be ‘at the very heart of Europe’. Thatcher duly spoke out; the party was restless, not least over the idea that a common European currency was the way to sustain European unity in the aftermath of the unification of Germany. Thus when Major attended the inter-governmental conference at Maastricht in December 1991, his room for manoeuvre was constrained on all sides by his own political weakness. He knew that an election was imminent; he could see that Labour was neck and neck with the Conservatives and that the Liberal Democrats were resurgent; and his insecure grip on his own party left him still in the shadow of his all too active predecessor.

Under the circumstances, the Maastricht negotiations were a triumph for Major, battling with his back against the wall; they were certainly presented to Europhobe opinion at home as a doughty defence of British sovereignty. But Major was no handbag-wielding wrecker, and his European counterparts knew it. True, he was reluctant to go all the way with the other eleven nations, now intent on achieving a fuller European Union, in reforming voting procedures so as to dispense with the increasingly frustrating requirement for unanimity; but, rather than exclude Britain altogether, they proved tolerant of Major’s difficulties and ultimately granted him the right to opt out of two major commitments. Britain was left free not only to defer a decision on whether to join the projected common currency (later called the Euro) but also to reject the social chapter (earlier called the social charter). This introduced European-wide standards in industrial and employment legislation and thus, for free-market Thatcherites, represented the interventionist iniquities epitomized by ‘Brussels’. As so often with Major, the canny compromise which he achieved was the product of clever politics. Those of his own supporters who were reluctant to accept the Maastricht Treaty at all, even in its bowdlerized British edition, knew full well, as one minister put it, that the alternative was the Treaty ‘with social chapter knobs on’ under a Labour Government.

Step by step, the European Union was disclosing itself as the issue that polarized British politics at the end of the twentieth century. It did so with a new twist for party politics. Under Macmillan and Heath, the Tories had been the party of Europe; but for many of their rank and file the true legacy of Thatcher was an increasingly strident antipathy to all things European. Conversely, an equal and opposite reaction could be observed within the Labour movement, which had woken up to the thought that Brussels might be the place to confront and contain Thatcherism, insular socialism having failed to do so. Thus when Jacques Delors, as President of the European Commission, addressed the TUC in 1988 he suddenly found a warm response; and trade-union enthusiasm for European regulation of working conditions became the hinge on which Labour Party policy turned 180 degrees from the 1983 pledge to quit Brussels altogether. Kinnock made himself the standard-bearer of change, ready to lead Labour into the general election of April 1992; he reproached the Major Government both for not signing up to the social chapter and for not following up ERM membership with a commitment to full monetary union.

This was an electoral battle fought under very different conditions from those of the Lawson boom in 1987. True, inflation was now falling steadily from its peak of over 10 per cent at the end of 1990; dipping under 4 per cent during 1992, it was to remain at or below that level for the rest of the 1990s. The percentage of the workforce unemployed, by contrast, was rising towards double figures as economic growth first began to falter, and then went into reverse; in 1992, GDP was clearly lower than two years previously. This was, albeit on a smaller scale, a recession like that of 1973–5, which had arguably brought down Heath, and of 1979–81, which had almost stopped Thatcher in her tracks – alarming precedents for Major to contemplate. Moreover, although the balance of payments showed an improvement once demand for imports fell, the need to defend sterling at a fixed rate within the ERM meant that interest rates remained stuck above 10 per cent.

Partly as a result, the property market had become a roller coaster on which the exhilaration of the upward swing gave way to a plummeting sensation in the pit of the stomach. Having doubled in four years by 1989, house prices then tumbled by 10 per cent over the next three years. Owner-occupiers of long standing might watch their paper gains come and go with wry resignation; but young people who had recently bought into the dream of a property-owning democracy now found themselves victims of a new syndrome – ‘negative equity’ – when the inflated mortgages of the Lawson era exceeded the suddenly reduced value of their homes. These were fears that gripped more people than were actually caught; but they were real enough, especially among Conservative voters, thus reinforcing a general sense of gloom.

With few signs of economic recovery, even the Chancellor, Norman Lamont, found his naturally buoyant spirits difficult to sustain. Yet the Conservatives fought an undeniably effective campaign. Denied the opportunity to enlist general economic self-interest on their side, they instead enlisted specific fiscal self-interest. From the beginning of 1992 they decided to blazon ‘Labour’s Tax Bombshell’ as their theme. Though Labour’s Shadow Chancellor, John Smith, with his sober Scots mien of trustworthiness, may have looked like a bank manager, a number of customers took fright when he outlined a prospective Labour budget that admitted the need to increase direct taxation. Smith, so the post-mortem had it, thus handed his opponents a card that they well knew how to play; and Kinnock, it was likewise held, compounded the error by appearing at an elaborately staged and prematurely triumphalist election rally – only to be excoriated by the polling-day issue of the Sun in correspondingly lurid terms (‘Nightmare on Kinnock Street’). Major, by contrast, made a wholesome point of speaking from an old-fashioned soap-box, albeit one fabricated by his props department.

There is much mythology here, seized upon to explain away a result that few had expected. For the fact is that Labour’s lead in the opinion polls – which had always been narrow – was out-trumped by a more solid Conservative lead of 8 per cent in the votes themselves. Such a last-minute upset could hardly have occurred but for pre-existing apprehension about Labour which was fed less by Smith’s candour or the Sun’s sensationalism than by the party’s own history over the previous fifteen years. Given this latent suspicion of a Labour Government, enough voters proved wary, in the event, not only of voting Labour but also of voting Liberal Democrat – for fear of letting Labour in by default. Neither Opposition party was able to deliver its promised poll, whereas Conservative votes exceeded Conservative promises. Hence the failure by the opinion polls to predict the outcome: almost as bad a result for them as for Labour and one which likewise provoked a determination not to make the same mistakes again.

Few prime ministers, replete with fresh electoral support, have had so little opportunity to enjoy it. Major was handicapped from the outset because the popular vote did not translate into parliamentary seats for him on the favourable terms from which Thatcher had always benefited. One factor was that Labour constituencies, often located in declining parts of the country, were shrinking and thus, pending the drawing of new boundaries, over-represented in parliament. Conversely, with a larger share of the national vote than in 1987, the Conservatives found their parliamentary majority cut from 100 to a mere 21. Given the pattern of by-election reverses now to be expected in Britain, this slender margin was always likely to be cut; by the end of the parliament the Conservatives, though still the largest party, had duly lost their overall majority; and the Government’s continuing need to count every vote enhanced the leverage of Europhobe rebels. The last thing that Major needed, therefore, was a crisis that simultaneously discredited his central policy stance while exposing his vulnerable European credentials to attack. This, however, was exactly what he got in September 1992.

The performance of the pound sterling within the ERM had for months given cause for concern. The original case for going in, as put in the mid-1980s, had envisaged a process of convergence and stabilization which was out of kilter with actual conditions in 1990, when the decision had eventually been taken – and taken, of course, as much on political as on economic grounds. The sterling exchange rate was too high at a midpoint of 2.95 DM – sometimes translating into over $1.90 against the American dollar – thus requiring high interest rates to defend the pound at just the time when the recession of the early 1990s required lower rates to help domestic recovery. This was an old dilemma. Major had no new answers. He even replicated the rhetorical excess with which Cripps had forsworn devaluation in 1949, or Wilson and Callaghan in 1967. ‘The soft option,’ declared Major, ‘the devaluers’ option, the inflationary option, would be a betrayal of our future.’

On 16 September 1992, ‘Black Wednesday’, the dam broke. Faced with irresistible pressure on sterling, Major and Lamont were joined by other leading members of the cabinet – so as to spread the blame, some concluded – in watching helplessly, while interest rates were raised briefly to 15 per cent, before they had to concede the inevitable. Lamont was left to announce sterling’s departure from the ERM; he was likewise left as Chancellor for the time being. He saw no need to resign himself for simply doing the dirty work on behalf of a man whom he had helped make prime minister two years previously; but good relations between the two neighbours in Downing Street were one ominous casualty of the crisis.

Lamont himself became another when, in May 1993, he was replaced by Kenneth Clarke, a veteran of the Thatcher Government but one who now emerged with increasing prominence as a staunchly pro-European ally of Major. So did Heseltine, with whatever private mixed feelings, and his conspicuous show of loyalty to Major was later to be rewarded with the double-edged title of deputy prime minister. This was the trio at the top which, over the remaining four years of its term, guided the Government through thick and thin, especially the latter.

The paradox of Black Wednesday was that an outcome which the Government professed to dread, and which indeed provided it with a dreadful moment, had in fact supplied it with a viable economic policy. The inheritance which faced Clarke certainly had its problems, of which the biggest was a huge hole in the budget. Measured by the traditional figures for revenue and expenditure (Consolidated Fund), the deficit was over 10 per cent of GDP in 1993–4: the highest figure ever recorded in peacetime. Responsibility for it lay more with Lawson’s failure to harvest a big surplus at the top of the boom, with Major’s complacency during his own year at the Treasury and with Lamont’s belated response to the impact of the recession upon the public finances than with anything done by Clarke. But he was the Chancellor who had to clear up the mess, which he did with a judicious combination of firm checks on public spending and stealthy increases in almost any kind of taxation – except income tax, which he was later able to reduce by 1 per cent in both 1995 and 1996. Labour derided all this in opposition, copied much of it in government.

Clarke’s real success at the Treasury lay in playing the good cards in his hand with a cool nerve that bordered on insouciance. Personally relaxed in comportment and apparel alike – more inclined to console himself with a pint of bitter than to worry about scruffy suede shoes – he brought a much needed sense of calm authority to a harried Treasury. He never talked about Black Wednesday as a devaluation, of course; but its effect was to devalue sterling against its main competitors by at least 10 per cent, with rates of around 2.50 DM or $1.50 becoming normal. This was the opportunity for British exporters, producing a shift into broad parity on the balance of payments within a couple of years. Interest rates were free to fall to levels of around 6 per cent, suitable to foster domestic growth. Even at the price of a public clash with the governor of the Bank of England on one occasion, Clarke showed himself determined to sustain this policy, which broadly vindicated his own judgement by results. Given that the economy was now in better balance, it was possible to combine economic growth, which went as high as 4.7 per cent in 1994, with continuing restraint of inflation.

Here, then, was the Government’s economic success story; yet when economic recovery came, it proved to be a ‘voteless recovery’. Major’s own standing had been fatally undermined: not just a temporary setback but a permanent loss of confidence in his judgement, demonstrably linked to Black Wednesday. The opinion polls showed that Labour immediately established a lead which was to be maintained continuously until the next general election; indeed from May 1993 there were 44 successive months in which the Conservative showing in the Gallup index never rose above 30 per cent. Prevailing electoral folklore about ‘mid-term blues’ simply disguised, for a while, the nature of a collapse unprecedented since the internecine disputes over Tariff Reform which had signalled the end of a similarly prolonged period of Conservative ascendancy ninety years previously. Now it was the hardly less fissile issue of Europe which divided and doomed the Conservatives, despite the objectively favourable economic record. In an era in which electoral behaviour has too easily been explained as determined by underlying social and economic factors, it is a liberating insight to recognize the limitations of such explanations and acknowledge how much can turn not only on the impact of events but also on that of political leadership.

NEW LABOUR

As leader of the Labour Party, Neil Kinnock had much to his credit by the time he stepped down after the 1992 general election (except losing it, of course). By 1990 he had presided over a home policy review that repudiated the lurch to the left of the previous decade in favour of a stance at least as pro-European and anti-inflationary as that of the Major Government. This extended, it should be noted, to support of the ERM – which might well have proved equally embarrassing for a Kinnock Government. As it turned out, a fourth lost election gave Labour’s self-conscious ‘modernizers’ a further argument for fundamental reform of the party and a further opportunity to effect it. When John Smith duly took over as leader of the Opposition, he too was committed to giving Labour a public face of electable moderation; and he too was reliant on the voting power of the trade unions within the party to deliver these changes. Yet many younger modernizers, notably Gordon Brown and Tony Blair, emerging as the stars of Labour’s team, now saw the entrenched role of the trade unions as a crucial anomaly, making reform of the party constitution itself into a priority.

Smith was not the natural leader of such a crusade. Happy enough to watch the Government digging itself into an ever deeper hole, he was only reluctantly persuaded to take up the campaign for ‘One Member, One Vote’ (OMOV) in the selection of Labour parliamentary candidates. To be sure, the reform was hedged with small print, and selection of the leader remained with an electoral college, equally divided among MPs, trade unions and ordinary members. It was the principle of OMOV, however, that counted; and it was on this that Smith staked his credibility as leader at the Labour Party conference in September 1993. This was a decisive step, and a successful one, but it proved to be his last. Within eight months he was dead, struck down by a sudden heart attack.

In May 1994, therefore, the Labour Party was prematurely faced with choosing a leader from a new generation. There were two obvious candidates. For some years Brown had stood out from his contemporaries as a potential future leader, with an intellectual grasp and an obvious moral integrity that befitted his upbringing as the son of a minister of the Church of Scotland; he notably made his mark through the aplomb and acuity with which he deputized for Smith, a fellow Scot, as Shadow Chancellor. In comparison, Blair was a recent recruit to the front bench. He was quick and clever, and his initiation as employment spokesman under Kinnock had been to face down the trade unions in extricating the party from commitment to the closed shop, which he achieved with a characteristic mixture of strategic boldness and tactical dissimulation. But his blossoming came as Shadow Home Secretary under Smith, when he showed a remarkable talent for projecting his ideas in a way that generated favourable publicity and thus, in two short years, established a charismatic public profile.

The two men had a broadly similar modernizing agenda; they had quickly established a rapport, sustained by close and trusting political cooperation; they had a reasonable expectation of working together effectively in high office at no distant date; they may each have contemplated which of them might ultimately succeed as leader himself; they had no idea that such a choice would be required so soon. Within days of Smith’s funeral, a decision was taken – exactly on what basis remains disputed – that Brown would not oppose Blair in standing for the leadership. The cause of the modernizers was thus ensured a sweeping success in July 1994, when Blair won a clear majority of votes, not only from Labour MPs and party members but, more surprisingly, from the trade-union section of the electoral college (even though the remaining candidate was the trade unionist John Prescott).

Blair was forty-one: two years younger than Brown, ten years younger than Major, twenty-seven years younger than Thatcher. This was a changing of the generations which the new leader showed himself adept in turning to his own advantage. Rather than standing rebuked for lack of experience – he was not even an MP when the last Labour Government had held office – Blair proclaimed that it was time to move on. He explicitly accepted many of the changes of the Thatcher era as given; he graciously (and cunningly) accorded respect to Thatcher herself; but he insistently called for a post-Thatcherite agenda capable of rallying the left as well as the centre to Labour’s side. His defeated rival, the forthright former trade unionist John Prescott, fifteen years older, was subsequently cosseted as deputy leader in order to bind up the party’s wounds; but the ‘beautiful people’ (as Prescott called them) in the new leader’s coterie regarded him as ‘out of the loop’. Peter Mandelson, born in the same year as Blair, had been a party apparatchik under the Kinnock regime, and, famed for his silky communications skills as a ‘spin doctor’, was publicly identified as privately influential: more eminent than grey. Blair capitalized on youth as a virtue in itself and as a metaphor for political rejuvenation. His vision of socialism, so he was to assure an initially bemused party conference in 1995, was that of ‘a young country’.

Blair’s first steps as leader already showed him intent on using his honeymoon popularity for serious political ends. At the party conference in October 1994 the theme was ‘New Labour’; and the fact that this was not just ‘spin’ gradually dawned on delegates who heard Blair call for the rewriting of Clause IV of the party constitution, enshrining the doctrine of public ownership. It was a move that paralleled OMOV in seeking to rebuild the party: having shed its trade-union identity, it would now shed its historic socialist commitment. Blair thus asked a lot; but his courage in taking his campaign to the members themselves, rather than relying on the block votes, was an instructive contrast with Gaitskell’s abortive attempt to change Clause IV a generation previously. Indeed, with the national membership now growing in the process, this proved to be a high point of mutual trust between Blair and his party; once in power, he proved more inclined to attempt control from above, as was shown by his heavy-handed efforts to determine the choice of Labour candidates for mayor of London and leader of the Welsh Assembly.

It is as fallacious to suppose that the advent of Blair changed everything in British politics as it is to assert that it changed nothing. Under Kinnock the modernizers had already brought Labour back from the brink, and under Smith the Conservatives were already stumbling. Blair was capitalizing on a position of strength, and the impact of his leadership was seen in various ways. He was the first leader of the Opposition since Harold Wilson in the early 1960s to seize the initiative at question time in the House of Commons and to wrest the agenda from the Government. He became a master of the ‘sound-bite’, encapsulating a policy stance in a quotable phrase, a process scorned by Major, though unavailingly.

Media attention flattered Blair, with his youthful demeanour often uncritically accepted as an earnest of wholesale political renewal. He played to a perception that he was not just another Labour politician. He came from a middle-class background, had attended a fee-paying school and had graduated from Oxford University. So had Attlee, Gaitskell and Foot before him. Blair, however, was the first to declare his lack of proletarian credentials as an asset. He claimed intuitive understanding of the fears and hopes of ‘middle England’ and sought to identify its professed values, whether flag-waving patriotism or upwardly-mobile aspirations, with New Labour as the party of ‘one nation’. These were, of course, traditional Conservative symbols and slogans, coolly appropriated in a vote-stealing exercise that self-consciously mirrored Thatcher’s earlier success in appealing to a traditional Labour constituency. The young Blair family were the epitome of New Labour, with their gentrified terrace house in Islington and with two professionally ambitious parents juggling the competing demands of their jobs and their three children. (A fourth was later to be born in Downing Street.) The worst taunt that faced the Labour Party was no longer that it was old-fashioned but that it was trendy; but, if so, this was undeniably a popular trend.

Month in, month out, Labour was scoring well over 50 per cent in the opinion polls. The contrast with Conservative disarray was cruel. Repeatedly, Major was thwarted by the rebels in his own party. When the rejection of the Maastricht Treaty in a Danish referendum in 1992 had forced him to defer British ratification, he found himself within a few votes of defeat in the House of Commons. Worse, in the summer of 1993 the Government lost a European vote and had to put down a confidence motion in order to survive. Still unreconciled, eight Conservative MPs persisted in voting against the Government in November 1994. Major fought back, first withdrawing the whip from the rebels, without effect; then restoring it five months later, also without effect. His final ploy was to submit himself for re-election as Conservative leader in June 1995, challenging his critics to ‘put up or shut up’.

It was not difficult to guess whom he had in mind. Michael Portillo, a favourite of Lady Thatcher at the time, was the most prominent and flamboyant of the right-wing cabinet ministers whom Major was known to regard as ‘bastards’. But Portillo sat tight during this leadership contest; instead, it was the forbiddingly aloof John Redwood who showed the courage to resign from the cabinet. Redwood thus championed the cause of the economic liberals and Eurosceptics who looked back on the 1980s as a golden age, though Lady Thatcher herself remained uncharacteristically circumspect about her own preferences. Major won by 219 to 89, with 22 MPs abstaining, and proclaimed victory. True, he had polled 66 per cent support, compared with Thatcher’s 55 per cent in the first ballot in 1990; but a third of his own party in the Commons was no longer behind him – about the same proportion that deserted Neville Chamberlain in 1940.

Though the Thatcherite votes which had first elected him leader had peeled away, it was unjust to accuse Major of betraying Thatcherism. The sort of initiatives that had caught the mood of the 1980s, however, did not necessarily wear well. Privatization had been a great theme in its day but perhaps the bottom of this barrel was in sight when the formula was remorselessly applied to the railways. Not only was there little public enthusiasm for this measure – something that had initially been true of other privatization measures – but the plan betrayed signs of incoherence, especially the separation of responsibility for the track from that for the rolling stock. Despite talk of ‘a poll-tax on wheels’, the franchising of the British Rail network to private companies was rushed through before the 1997 election. A spate of harrowing train crashes, in which track maintenance was a factor, subsequently found many people muttering ‘I told you so’ (even if they had not).

Likewise, deregulation had often seemed a fine cry in the 1980s, with the awkward questions coming later. How much was it responsible for permitting lax procedures in both the feeding and slaughtering of animals? It had gradually emerged during the 1980s that the eruption of BSE in cattle (‘mad cow disease’) was particularly prevalent in the United Kingdom because of market-driven practices that were more effectively policed elsewhere. This was bad enough; but the mounting evidence that BSE could cause variant Creutzfeldt-Jakob disease in humans warned of alarming possibilities, hitherto officially denied. A ministerial statement admitting the link was made in March 1996, followed by a virtually universal ban on the export of British beef, which became a particularly sensitive issue within the European Union, thus exciting some predictable responses on all sides. There were indeed market responses to this crisis, as consumers shied away from beef (though less in Britain than on the Continent) and British farmers faced the long haul of re-establishing the export trade. Nobody, however, argued for simply leaving this crisis to the market to sort out; stricter regulation was back; and, though cheap political gibes were out of place, the fact that Britain led the world in these gruesome statistics was a cause for rueful reflection.

Undeniably, Major was an unlucky prime minister. He had to take the blame for events beyond his control; but he did so with a hapless air that inspired personal sympathy rather than respect for his leadership. Nowhere was this more obvious than when this ostensibly upright man found himself engulfed in a string of scandals, many of them initially trivial, which nonetheless tainted his government with an image of ‘sleaze’. The fact that Major’s speech to the 1993 Conservative Party conference was packaged under the catch-all title ‘Back to Basics’ was a hostage to fortune; although not intended as a sermon, it was taken as preaching a text from which any moral lapse would be severely judged. Thereafter, at disconcertingly frequent intervals, sexually titillating revelations about ministers appeared in tabloid newspapers; when resignations followed, they were usually belated, making the prime minister’s ineffectual support an issue in itself.

More serious were the linked allegations about abuse of parliamentary procedure (‘Cash for Questions’) and the acceptance of compromising hospitality by MPs. This led to widespread debate about standards in public life and, more specifically, to the well-publicized disgrace of two prominent Conservative ex-ministers.

One result was that the House of Commons set up a new Standards and Privileges Committee, requiring MPs to register their financial interests. On all of this, the Conservatives gave the impression of dragging their feet. Labour, by contrast, proclaimed impeccability as the new standard; this might be seen as its own version of ‘Back to Basics’, thereby ensuring that a future Labour Government would be vulnerable to any perceived infractions of its self-proclaimed code – as duly occurred, especially over donations to the party. Lloyd George and Churchill would never have survived scrutiny of their finances under the stringent regime in place at the end of the century. The net result of imposing higher expectations was to diminish public confidence in the integrity of all politicians; but it was the Conservatives who were hurt by ‘sleaze’ in the short term.

Like Balfour before him, Major was grimly determined not to cede office, whatever the humiliations of his position or the hopelessness of his prospects. It was long taken for granted that the parliament elected in April 1992 would last its full five years; taken for granted also that the Conservatives would finally lose power after a unique spell of eighteen years in office. Their only hope of moderating defeat, when no longer expecting victory, seemed to lie in the economic record. Yet the achievements of Kenneth Clarke at the Treasury, though widely recognized, could not be made central to the Conservative campaign, because, within his own party, he had become virtually isolated on account of his pro-European views. The official line, maintained within an increasingly fragile cabinet, was to keep open the possibility of British participation in the single currency, scheduled to begin in 1999.

Opinion polls on the Euro (as it was now called) showed the public against British entry by a margin of two to one. Hence the alarm among Conservatives in 1995 at the formation of a new party, run by the financier Sir James Goldsmith, with the sole aim of forcing a referendum on British membership of the European Union. This issue was given much publicity in Rupert Murdoch’s newspapers, especially the Sun, still preening itself on its self-proclaimed role in swinging the previous election against Labour but now poised to endorse New Labour under Blair. In April 1996 Clarke had agreed, with some reluctance, that there should, at any rate, be a referendum before Britain adopted the Euro; and, within months, Labour promised the same. Formally, therefore, Government and Opposition were agreed. It is too simple to call the referendum a device for masking the fact that the Conservatives were against the Euro and Labour in favour; but it certainly served different purposes in each party. It notably allowed Blair to express, in the pages of the Sun, an emotional attachment to the Queen’s head on the currency – sentiments curiously at odds with his supposed wish to join the Euro, though not with his stronger wish to win the election.

New Labour was the big story in the general election, with polling day finally fixed for 1 May 1997. Conservatives might warn that New Labour was a fraud: a purely cosmetic operation which concealed the old familiar socialist monster. In fact, Blair had anticipated and thus pre-empted such charges by telling the party that it must openly acknowledge the changes it had undergone – the more painfully the better in proving that it was not the creature of an atavistic trade-union movement. Brown had underpinned this strategy of reassurance, not only with a robust reaffirmation that a Labour Government would adhere to the public spending guidelines already set by Clarke for the next two years, but by adding a pledge that rates of direct taxation would not rise. Thus were the ghosts of past Labour defeats ritually exorcized.

Labour’s landslide victory had long been forecast by the opinion polls; but it had been disbelieved almost as much as it had been discounted. In fact, Labour polled 44.4 per cent, some way short of predictions. Still, the electoral swing against the Conservatives was the biggest since 1945 and reduced their number of MPs to 165, their smallest number since 1906. Their own divisions undoubtedly contributed to this result, with disagreements surfacing during the campaign, as individual Conservative candidates broke rank by declaring themselves against entry to the Euro or even against membership of the European Union itself. This was partly through fears of being outflanked by more extreme Europhobic candidates, of whom those put up by the Referendum Party were the most prominent. The latter’s national vote, at 2.7 per cent, may have been small, but in a number of constituencies its poll exceeded the margin by which Conservative MPs were defeated. Though analysis suggests that this was hardly a significant reason for the Conservative débâcle, it was seized upon to reinforce the anti-European bias of the depleted party that trooped back to Westminster. Influential Conservative newspapers, notably the Daily Telegraph, were unabashed in their belief that it was the European issue that would sweep away the Blair Government.

This was Blair’s election. In the new parliament he had over 400 supporters, comparable with Attlee in 1945 and Campbell-Bannerman in 1906; but neither had dominated an election campaign as Blair did in 1997. This was partly a matter of contrivance, with a well-disciplined party machine which had learnt from Clinton’s Democrats in the USA the importance of staying ‘on message’, and which was backed by the techniques of focus groups and rapid-rebuttal procedures. Exploiting the dark arts of the spin doctors, in their back rooms, Blair himself could take the public stage in an aura of sweetness and light. He succeeded in making his own unchallenged leadership into a self-evident guarantee to nervous middle England; at the same time he made it a focus of inspiration for those who nurtured more demanding hopes as they helped elect a Labour Government – a new experience for any voter under the age of forty-one. Blair’s response to victory – ‘A new dawn has broken, has it not?’ – risked bathos but was, for the moment, accepted at face value.

This result was produced not only by the strength of Labour’s national vote but also by its distribution. The electoral system, which had favoured the Conservative Party in the 1980s, now drove nails into its coffin. It lost previously safe seats to Labour, such as Enfield in suburban north London, a defeat for Portillo; it also lost them to the Liberal Democrats. With 17.2 per cent of the vote, a slight decline, the latter nonetheless increased their representation in parliament to 46 MPs, the biggest third party since 1929, in Lloyd George’s day.

The new prime minister, though confident of winning, had not expected this landslide. Indeed he had a contingency plan, consistent with his aim of broadening his political support, for a possible coalition. As leader of the Liberal Democrats, Paddy Ashdown had been assiduously courted by Blair, and the two men shared confidences about a joint ‘project’, as they called it, building on sympathetic political affinities. They were right to sense that many of their supporters in the country shared a common objective in getting rid of the Major Government. Unlike 1992, most Liberal Democrats were now unafraid of a Labour Government under Blair; unlike the early 1980s, many Labour supporters were ready to vote tactically for Liberal Democrats wherever they had a chance of displacing sitting Tory MPs. But tactical voting on this new scale in 1997 did not lead to cooperation in government; instead, by inflating Labour’s own majority, it made the idea of coalition seem quixotic, as was recognized quickly by Blair and more slowly by a disappointed Ashdown.

The fact was that the Blair project relied for its impressively widespread appeal upon its inclusively elastic definition. In his rhetoric Blair often claimed a Liberal as well as a socialist pedigree for New Labour; and he sometimes spoke of a pluralist conception of politics that could embrace electoral reform. But, after 1997, he found himself a prime minister with a commanding majority, produced by an electoral system that had served his party well. It was actually Blair’s capture of the Labour Party that provided the institutional support for his project – a new project, perhaps, but one now tied to party interest. Hence some of the ambiguity in his claim, on taking office: ‘We have been elected as New Labour and we will govern as New Labour.’

CELEBRITY!

In August 1997 a road accident ended the life of Diana, Princess of Wales. That she was only thirty-six and the mother of two boys not yet out of their teens would in any case have given her death a tragic personal dimension. That she was once the wife of the heir to the throne meant that there was also bound to be public interest. Buckingham Palace had considerable experience in handling such matters with a taciturn sense of decorum and a carefully calibrated demonstration of grief and mourning. None of this, however, proved adequate to this particular royal occasion. Princess Diana, as she was commonly if incorrectly known, had already shown an ability to generate widespread popular sympathy, displaying an instinct for public relations which some called intuitive and others manipulative. She had talked on television about the breakdown of her marriage; she had led a high-profile social life, both courting and evading the attentions of the paparazzi; her death came about in Paris, in the company of a playboy millionaire, in the course of a high-speed car chase. It was not the Palace but Downing Street that deftly caught the popular mood, with Blair’s immediate comment that Diana was the ‘People’s Princess’. The outburst of emotional public mourning that followed, with flowers piled shoulder-high at Kensington Palace and streets lined with crowds at her funeral, was wholly unexpected.

True, the impact of the death of Churchill, or even Nelson, could be invoked as precedents. But the obvious fact that they had been heroic figures with striking public achievements to their credit pointed up the contrast, for Diana was celebrated less for what she had done than for what she was. In this, it might be said, she was like other members of the royal family, with a role defined by birth or marriage; but she was unlike them in having also acquired the status of a celebrity, in the sense of being famous for being famous. It was her personality, as projected through the media and as imagined by a receptive public, that evoked the visceral response to her sudden death: a response that could be rationalized but that was essentially non-rational and, as it turned out, ephemeral. At the time, what seemed odd and novel, even disturbing, was the revelation that the British had shed so many inhibitions. In fact, signs of changing conventions were already apparent. For instance, when Paul Gascoigne, a famously impulsive footballer in the England team, publicly wept over a refereeing decision during the 1990 World Cup, he might have expected reproof for failing to take his medicine with a stiff upper lip. Instead, the incident was the making of ‘Gazza’ as an instant folk hero (with a correspondingly transient appeal).

Such examples, of course, can be asked to bear too heavy a freight of social interpretation. Rather than a total change in sensibility, what is disclosed here may be better understood as an incremental shift in idiom, as fashions change. The idea that the British – or rather the English – are by nature dispassionate and reserved is an obvious national stereotype, often contrasted with a supposed capacity in Latin or Celtic nations for warmth and emotion. Clearly, such images may be self-reinforcing, not least insofar as they are ambivalent, cherished as implicitly affirming an identity even when it is explicitly satirized. But these are cultural constructions, and susceptible to adaptation over time. Thus the English aptitude for deliberate understatement continued to pervade the national culture throughout the century, but it did so in protean ways, long after the clipped, upper-class accents that had given Noël Coward his métier had been laughed off stage. The form itself survived, with dry, terse, unsentimental, euphemistic or allusive comments hinting at an emotional hinterland never formally explored. It is faithfully captured in Graham Swift’s novel Last Orders, which won the Booker Prize in 1996 and was made into a distinguished film five years later, with Michael Caine and Tom Courtenay adept at encapsulating a lifetime’s vicissitudes through a laconic remark, delivered deadpan, in the setting of a South London pub. Understatement may thus be characteristically English but is not timeless, still less innate. Such national characteristics have to be understood as an art-form or a code, one which is learnt, or at least conditioned, and thus relies upon a transmission mechanism between generations to sustain it.

If culture has traditionally been passed from the old to the young, however, popular culture in the late twentieth century often inverted the process. This inversion mirrored another, whereby high fashion saw a social elite mimicking street styles. The cultural norms established by young people, instead of remaining specific to their generation, acquired more widespread acceptance in a society geared increasingly to the tastes, language, behaviour, pursuits, sports and fashions of youth. This built, of course, on the existing salience of a distinctive youth culture since at least the sixties and to some extent reflected the self-image of earlier cohorts, who resisted the staid expectations and conventions of middle age. If it could be said that in 1960 only people under the age of thirty wore jeans, it could be said that in 2000 only people under the age of seventy wore jeans – the same people, forty years on. It is a matter of opinion whether Britain had become the young country of Blair’s rhetoric or one that simply refused to grow old gracefully.

Informal dress, often originating in teenage fashions, like punk and hip-hop, became more widely acceptable. Restaurants and theatres relaxed their dress codes, partly in pursuit of a younger clientele; many employers allowed styles of dress at work that would once have been thought suitable only for a holiday weekend. Women wore trousers as a matter of course, not only at work (after a number of employment discrimination cases had been fought and won) but even on the most formal occasions, whether to take a university degree or to attend a state banquet. Men now found that, even with a suit, a tie was often dispensable, being replaced by a roll-neck top or by a shirt fully buttoned to the collar (if, indeed, the shirt had a collar at all).

The American influence was manifestly strong in the range and style of casual clothes, from jeans and chinos to trainers. Sweatshirts and t-shirts became standard items of apparel for both sexes, as did baseball caps and variants on tracksuits. These originated from American sportswear, though it did not take long for domestic equivalents to enter the market, notably the adaptation of rugby jerseys as informal clothing for both men and women. The prominent logos on many of these garments were originally institutional, often reflecting support for college or university teams, but their commercial possibilities were open to exploitation in various ways. Manufacturers’ labels, once discreetly tucked inside a neckband, were replaced by prominent logos blazoned on caps, sweatshirts or sports shoes – indeed, it became virtually impossible to buy such items in an anonymous form. Authentic sportswear suppliers were superseded by brands that were well known for being well known: not manufacturing anything themselves but outsourcing their supply, often in third-world countries; above all, carrying all the chic of the right name and the right label. These were fashions that migrated from teenagers to their parents, in an up-market trajectory from the sports field to the high street to the fashion house. Thus brand-name snobbery, once a discreet status symbol, was now overtly displayed even on formal clothes. The distinctive plaid lining of Burberry mackintoshes, for example, became a worldwide fashion accessory, initially through a range of scarves and handbags.

The importance of branding was the marketing lesson here. It was not just commercial companies that came to appreciate the importance of a brand image in identifying themselves and what they learnt to call their product. The history of New Labour, with the red rose as its logo, can be told in these terms. The name under which an organization promoted itself, moreover, often needed to be elastic in its provenance, given the possibilities for vertical and horizontal integration between diverse activities, often far removed from its core business. Commercial organizations sometimes chose to retreat behind initials, which did not specify particular business activities and which were equally devoid of embarrassing connotations in any language. But better still was a free-floating name, like Virgin, which was as apposite for an airline as for a record or a soft drink, with the ambition, as the company’s extrovert founder Richard Branson explained, to ‘build brands not around products but around reputation’. The crux was the importance of a visually distinctive logo attached to a well-advertised name, evoking recognition and enlisting partisan loyalty.

The wheel came full circle when such strategies, derived from the model of competitive team games, were applied to professional sport. Again the American example led the way, not least in showing the possibilities of marketing sports teams or clubs as ready-made brands, able to draw upon generations of customer loyalty. By 2000 Manchester United was making £25 million a year by merchandising not only replica soccer kit in its distinctive red strip, and the spin-off calendars, magazines and videos directly relating to the club, but also a range of items from duvet covers to whisky. This was an important part of a business plan that put it alongside the Dallas Cowboys as one of the two richest sports clubs in the world.

The force that made ‘Man U’ into the leading brand in British (or European) football was not so much sporting skill as business acumen. Its rise was symbiotic with that of satellite television. Rupert Murdoch, no sports fan himself, was alert to the possibilities of using a franchise on sport as a battering ram, one that would enable his BSkyB channel to penetrate an initially resistant market. Hence his bold decision in 1992 to bid over £300 million for a four-year contract with the leading clubs, splitting them from the rest of the Football League by creating a new Premiership division. Football coverage was a gamble which handsomely paid off for BSkyB when it signed up a million new customers, while the elite clubs in the Premiership enjoyed vastly increased revenues. In 1998 Manchester United would actually have been sold to BSkyB for £625 million but for the intervention of the Monopolies and Mergers Commission, which judged that this degree of concentration of ownership of closely linked business activities would work against the public interest.

Other sports, to be sure, had their hour in the sun and, more crucially, on the screen. Sponsorship deals gave a further twist to the publicity spiral. Wimbledon retained its prestige as a tennis tournament, reaching a worldwide television audience, while at home, in an annual cycle, dizzy hopes of success for British players were as regularly raised in the press as they were dashed on the courts. Cricket likewise seemed to be a pursuit that the English had invented in a more leisurely era of white flannels and cucumber sandwiches, only to see their own subsequent proficiency cruelly exposed by the rigours of speeded-up, time-limited international competitions, with players clad in the sponsors’ media-friendly, colour-coordinated kit that looked good under the lights. English rugby, it is true, flourished under this professional code, which injected a new pace into a high-scoring game just as it literally projected new logos on to the pitch, in a process that saw the English team rediscover the ability to match the performance of the previously dominant southern hemisphere. With significant sums from television coverage now flowing to a relatively small elite of top clubs and top players, rugby union football – largely eclipsing rugby league – had become, albeit on a smaller scale, more like soccer, which overwhelmingly remained what most people understood by ‘football’.

Football, ‘the beautiful game’, acquired an unprecedented dominance in popular culture in the late 1990s, whether measured in terms of media coverage, popular following or financial rewards. Professional footballers, once tied to their clubs by highly restrictive contracts, had notably improved their terms of employment from the late 1970s; but it was not until 1995 that they established free agency and thus achieved full bargaining power. A top player for a top club, like David Beckham of Manchester United, could already command pay of £10,000 a week by 1997 – more than the annual earnings of footballers in an earlier generation. Now big earners, some inevitably became big spenders, their nightclub revels attracting the kind of media attention that sat ill with an image of sporting dedication but provided high-octane publicity. Moreover, a player like Beckham, with his personable good looks, was now able to negotiate endorsement deals, for haircream or sunglasses, which multiplied his earnings – and in turn generated a degree of fame off the pitch which could be translated back into improved contracts with his club. When he became captain of the England team, he was more than a talented footballer: ‘Becks’ was treated as a national icon.

It was not uniquely British, of course, to invest the fortunes of national teams with wider surrogate significance, as the success of a multicultural French team in winning the 1998 World Cup aptly illustrated; but France could at least express a shared identity focused on a single national team. Not so in the United Kingdom. International rugby matches, in particular, had primarily meant those between the four ‘home countries’ (with France making this a Five Nations championship, later expanded to Six Nations with the admission of Italy). England versus Scotland or England versus Wales were contests that annually enlisted ancient national sentiments and rivalries, as the display of the blue-and-white Scottish saltire (St Andrew’s cross) or the red-and-green Welsh dragon well attested. But whereas the half-hearted (and inappropriate) riposte of English supporters used to be the Union flag, during the 1990s the red-and-white St George’s cross was readopted as the symbol of the England team, first in football, then in rugby and cricket. The English flag, hitherto to be seen flying only on Anglican churches, thus became a common sight, in turn encouraging the private display of the Union flag, previously to be seen only on public buildings or at right-wing political rallies. Here too New Labour was not slow to identify itself symbolically, notably with a mass of hand-held Union flags to welcome Blair to 10 Downing Street in 1997. The pop-culture nationalism of the football stands, focused on England’s performance in the World Cup in June 2002, was seamlessly fused with the contemporary celebration of the Queen’s Golden Jubilee, prompting the simultaneous display of the Union flag and that of St George on a massive scale throughout the towns and villages of England.

This bunting bonanza was another remarkable change in a people often presumed oblivious to this kind of symbolic ostentation. True, the ‘Union Jack’ image had always been familiar, often appropriated since the sixties in a tongue-in-cheek way that owed something to the Last Night of the Proms in its ambivalent ‘cod’ patriotism. The revival of British pop music in the mid-1990s drew on this idiom. ‘Britpop’ consciously harked back to the Beatles in seeking to challenge American dominance of this market, only to find that its own long-term impact was limited by failure to discover musical talent of the calibre of Lennon and McCartney. Still, the runaway success of the Spice Girls in the USA with their first single, ‘Wannabe’, in 1996 surpassed the comparable sales of the Beatles; and ‘Wannabe’, as well as topping the British charts for seven weeks, reached the top of the charts in thirty-two other countries. Challenging the success of previous boys’ bands, this carefully calculated campaign to launch a group of five girls, with an in-your-face appeal, acknowledged Thatcher as one role model, and exploited both feminism and femininity. The Spice Girls had ‘attitude’; they wore Union Jack knickers; they made themselves the top brand as well as the top band. As one of them later said: ‘We wanted to be a “household name”. Like Ajax.’ The question of whether they had authentic musical gifts, which could be sustained in live performance, was secondary to their ability to project themselves through choreographed song-and-dance routines, taking the form of singles, albums, videos and films. For those who could not readily recall their names, they were handily packaged with labels: Ginger Spice, Baby Spice, Scary Spice, Sporty Spice and Posh Spice. They were, so behind-the-scenes shots presented them, ordinary girls keeping faith with their fans, though now leading extraordinary lives which magically brought them fame.

This was a celebrity world in which the media was always alert to the possibility (and sales potential) of fairy-tale romance. By 1997 Victoria Adams was already a celebrity as Posh Spice; she was already worth millions (perhaps £24 million by 2000); she was already consumed with an ambition to be ‘as famous as Persil Automatic’. Whatever next? It was a tabloid dream come true when the news broke that ‘Posh and Becks’ were ‘an item’, and when OK! magazine subsequently paid £1 million for the pictures of their lavish wedding, the event became self-financing: a marketing marriage made in heaven. Some now talked of the Beckhams in the rapturous tones once reserved for royalty, or at least for the late Princess of Wales.

Here was a point of confluence between two hitherto strongly gendered subcultures. One was the laddish world of the football fan, as conveyed in Nick Hornby’s well-observed Fever Pitch (1992); and the other was the world of gay men, attracted by Beckham’s gentle, androgynous style (which notably did not diminish his own popular appeal). Hence the bridge to the girl-talk, fashion-conscious sphere that Helen Fielding caught in Bridget Jones’s Diary (1996), originally written as a newspaper column. With a plot that was a pastiche of Pride and Prejudice, it successively became a bestseller and then a successful film (2001), with the American actress Renée Zellweger in a convincingly Anglicized performance. The production company that made this, Working Title, had likewise imported Hollywood stars for other hit films, notably Four Weddings and a Funeral (1994), which had also achieved international receipts of over £250 million, taking the old genre of an exportably quaint, picture-postcard Britain, but purveying it in the idiom of the 1990s. By contrast, The Full Monty (1997) was less predictable in bursting upon international success, with its stark northern background of industrial redundancy prompting the unemployed heroes to exhibit their virility in the improbable guise of male strippers: a parable of the flexible labour market that asserted changing mores in more ways than one. The multicultural Britain of the late twentieth century was perhaps best captured on film in Mike Leigh’s sensitive and nuanced exploration of family tensions, Secrets and Lies (1996); and conveyed in print, in a way that jumped off the page, in the remarkable first novel White Teeth (2000), which instantly made Zadie Smith a celebrity author.

Though the cult of celebrity was obviously not wholly new, its purposeful propagation through the media of the late twentieth century made it more pervasive. Just as high fashion now fed off popular styles, so in the arts there was a premium on ‘street credibility’ in publicizing celebrity practitioners, often driven by sponsorship deals. To some extent this was a conscious attempt to break down barriers to participation, with a celebrity conductor like Simon Rattle reaching out to a wider and younger public for classical music. This was to challenge the stereotype of high culture as an elite pursuit, rather in the way that Stephen Daldry’s film Billy Elliot (2000) told the story of a boy from a working-class background in north-east England who surmounts hostile incomprehension to realize his ambition of becoming a ballet dancer.

The Turner Prize at the Tate Gallery showed its annual capacity for generating a high degree of publicity, if not of popular approval, for work like that of Damien Hirst, notoriously using formaldehyde to preserve carcasses for public exhibition, or Tracey Emin, inviting inspection of her unmade bed. Funding for the arts showed keen concern for public access, as the name of the new Department for Culture, Media and Sport clearly proclaimed. The use of revenues from the new National Lottery, set up in 1994, to support the arts meant that by 2000 a total of £1 billion had been distributed by the Arts Council of England in more than 2,000 awards, ranging from such traditional recipients as the Royal Opera House to the new sculpture at Gateshead by Antony Gormley, ‘Angel of the North’, with its 54-metre steel wingspan visible for miles on the A1.

The National Lottery also became the source of funding for the Millennium Commission, charged with marking the year 2000 in a way that would signal a new era. It received over £2 billion, of which most was committed to an array of capital projects, large or small, commercially profitable or not. Conspicuously largest and, as it turned out, least profitable was the Millennium Dome. Much was demanded of it: to serve as the hub of a great exhibition like that of 1851 or 1951; to focus the world’s attention on the prime meridian at Greenwich as the hour struck for the new millennium; to reclaim a contaminated riverside site for urban regeneration; and to justify bringing the London Underground to this part of south-east London. Some of this was achieved, as the impressive extension to the Jubilee Line testifies, with its cathedral-scale stations from different architects happily uniting function and form in a stark modern idiom.

The fact was that the Dome looked better in the distance than close up, better outside than inside, and better in prospect (or hindsight) than at the moment it was designed to celebrate. Inheriting controversial plans and incoherent projections from the Conservative Government in 1997, Blair himself was responsible for deciding to press ahead regardless, raising expectations and raising the stakes in a manner that was to become his own trademark. The vast Dome – ‘a very big umbrella’, as its architect, Richard Rogers, described it – was a technological wonder, shimmering above the Thames. But equally big questions remained. What to put in it? How to make it pay? If the Dome confounded the worst-case prophecies by opening on time, it was at a price. With its costs spiralling towards a billion pounds, and with the number of customers falling well short of the wildly optimistic forecasts in the business plan, the Dome ended as a monument neither to business acumen nor to entrepreneurial flair.

Such attributes were more positively identifiable elsewhere. It was increasingly apparent that literature had become big business, reaching a new level of profitability. The Booker (later Man Booker) Prize for novels published by Commonwealth authors was promoted with increasing media attention as a competitive spectacle. A beneficial result was that writers of high calibre now enjoyed correspondingly high sales. The merits of many prize-winners were self-evident: A. S. Byatt’s Possession (1990) was a sensitively developed exposition of a Victorian literary puzzle; the Canadian Michael Ondaatje’s The English Patient (1992) juxtaposed the settings of Egypt before the Second World War with wartime Italy in unfolding a poignant love story; and Pat Barker’s The Ghost Road (1995) completed a fine trilogy on the First World War, building on the relationship of the poets Owen and Sassoon. All of these were later filmed with differing kinds of accommodation to cultural internationalism. Nor were these novels alone in using historical research to sustain their sense of period. Indeed the gap between fiction and history seemed to narrow with the increasing public appetite for both serious biography and other accessibly written historical works – especially those which were tied to television series, with their presenters duly finding fame and fortune.

Celebrity, it seemed, had a momentum of its own. Some celebrity authors, famous as cooks, footballers or entertainers, plainly did not write the books that were promoted on the strength of their authorship. In the long history of British ghost-writing, the late twentieth century provided a new chapter second to none. The results spoke for themselves in sales figures for bestsellers that eclipsed all previous records. If it was the magic of the media that sustained this process, it was fitting that the creator of Harry Potter, whose wizardry was itself celebrated in a remarkable series of children’s books, should emerge as the most successful British author, albeit one who actually did the writing herself. Indeed, J. K. Rowling had retyped her first manuscript in 1995, because, as an impoverished single mother, she could not afford to photocopy it. Yet by 2000 she had earned £35 million through Harry Potter, now with film rights tied in to a range of marketable products, and with book sales of 66 million worldwide. It was a story that even the tabloids could not have made up.

THE RULE OF LAW

It may be proverbially true that crime does not pay; but, in political terms, it often paid off for the Conservative Party. Just as social policy and the health service had long been assumed to belong to the left, so law and order was regarded as an axiomatically right-wing issue. Anxiety about rising levels of crime, real or perceived, fed calls both for stronger policing and for harsher sentencing. Liberal ideas about the need to understand the social conditions which fostered deviance and illegality were represented as extenuating wrongdoing when the real answer was punishment. Michael Howard, appointed as Conservative Home Secretary in 1993, tapped this vein of rhetoric, notably in his reiterated sound-bite: ‘Prison works’.

It was a defining moment in British politics when Howard discovered that the Conservatives no longer had this trump card up their sleeves. For it was in shadowing the Home Secretary that his young Labour opponent, Tony Blair, showed his impressive ability to finesse the Conservatives on their own ground. He came up with a slogan (actually coined by Brown) which balanced an acknowledgement that criminal activity had no excuse with a recognition that merely retributive responses were inadequate. As Blair first put it in a television interview in January 1993: ‘What we need is a proper national strategy for crime that’s both tough on crime and tough on the causes of crime.’ This was the mantra that made him famous, identifying the man elected as Labour leader within eighteen months.

It should never be forgotten that Blair was a lawyer before he was a politician. He was the son of a lawyer; his degree at Oxford was in law; he practised at the bar, which is where he met his future wife, Cherie Booth, another lawyer. Although Blair never rivalled her subsequent success in the legal profession, instead entering the House of Commons in 1983, the front-bench responsibilities that soon came his way also had a significant legal dimension: first in recasting employment and trade-union legislation and then – crucially for his advancement – in becoming Shadow Home Secretary.

There was, of course, much more to Blair’s political outlook than a legal perspective, still less one narrowly defined. His was an overriding concern with justice and the rule of law, and it framed many of the issues which enlisted his deepest and strongest commitment. This mindset may help to explain some of the central preoccupations of the Blair Government, as its priorities were disclosed in office: both what was done and what was left undone.

This was in many ways a new kind of Labour Government. Its opponents were disappointed in their prophecies that it would wreck the economy in a wild reversion to tax-and-spend policies; this picture of Old Labour, which the stance of the left itself had perversely fostered in the 1980s, was shown to be a caricature. Brown, a student of Labour history himself, emerged as a disciplined Chancellor, with an austerity of purpose that recalled Cripps and a prudence in public finance that emulated Jenkins. Having inherited a strong economy, he took measures to sustain its strength rather than imposing immediate strains upon it. The results made an impressive story by the time of the next general election (and one that this Government was not inhibited from telling). During the eighteen years under Conservative Government, 1979–97, annual economic growth had been 2.2 per cent; in the first four years under Labour, 1997–2001, growth averaged 2.6 per cent.

This was certainly no Brown boom, on the model of the Lawson boom or the Barber boom before it. Indeed, the promise to end the boom-and-bust cycle was integral to Brown’s stewardship from the outset. His first move as Chancellor, taking almost everyone by surprise, was to shed one of his own powers: control of the base rate. Instead, on what was seen as the German or the American model, responsibility for monetary policy was put into the hands of the central bank, and the Bank of England accordingly established an independent Monetary Policy Committee, charged with setting interest rates so as to keep inflation to a target of 2.5 per cent. This was the more credible since such levels of inflation were already within sight. Though Clarke had, in the recent past, guessed better than the Bank about the proper level of interest rates, giving the Bank independence was well received in the City. Above all, it was subsequently vindicated by a steady decline in inflation: so much so, in fact, that by 2002 fears of deflation began to resurface for the first time in half a century. By then, the Conservatives too had accepted the independence of the Bank.

Labour thus succeeded in living down its inflationary reputation. Brown was able to achieve this without having to sacrifice employment. He had personally championed a ‘Welfare to Work’ pledge, aimed at removing 250,000 young people from the unemployment register, and financed by a windfall tax on public utilities (which were currently in ill repute for exploiting their monopoly position for private gain). The pledge was fulfilled, in that the relevant numbers dropped; whether this was specifically due to the programme or to the general prosperity of the economy was politically neither here nor there. The official unemployment figures, which had never fallen below 5 per cent in eighteen years of Conservative Government and had again peaked at over 10 per cent as recently as 1993, now fell in parallel with the inflation level. By the beginning of 2001 the unemployment figure of 2.3 per cent was at a level to rejoice even unreconstructed Keynesians. It was clear that workers were not pricing themselves out of jobs – and this was so even though Labour had, on taking office, accepted the European Union’s social charter, including a minimum wage, albeit at a level set with extreme caution. Again, by 2000 the Conservatives had abandoned their opposition to a Government measure that proved popular and viable, just as Labour in opposition had previously had to swallow many of Thatcher’s initiatives.

The more controversial side of Brown’s policy, perhaps not surprisingly, concerned its fiscal implications. By simultaneously accepting the current rates of income tax and the expenditure limits already set by the Conservatives for 1997–9, it might seem that Brown was pledged neither to tax nor to spend. This was not wholly the case. Indirect taxes were allowed to rise under this formula; Brown manipulated them ruthlessly, especially in financing a series of undramatic redistributive measures, of which his favourite was the working families’ tax credit. He was undeterred by Conservative taunts about ‘stealth taxes’. Taxation actually rose from 35 to 37 per cent of GDP and, since GDP itself was higher because of growth, this meant that Government revenues were further swelled, eliminating the budget deficit by 1998. Most of the fiscal surplus was applied to paying off significant amounts of public debt, thus reducing the claim of debt-servicing on future revenues. Brown’s fiscal rectitude was far from stealthy; he made prudence his byword; but it necessarily involved him in resisting the immediate claims of the spending departments.

Yet Labour had talked of its hopes for improvements in the social services and in health care. One election pledge was to cut the NHS waiting lists by 100,000. This meretriciously precise number arguably generated perverse pressures within overstretched hospitals, where performance was thus being monitored by one arbitrary yardstick, and one that they were expected to meet without corresponding increases in funding. Blair himself had, in a famous sound-bite of 1996, declared his three priorities for government as ‘education, education, education’. Again, a specific pledge to cut class sizes to thirty for children in infant schools (aged five to seven) not only proved difficult to meet in itself but, insofar as it became the top priority, created adverse displacement effects within the educational budget as a whole. That these commitments were not only expansive but expensive too became embarrassingly apparent. In Downing Street the buzzword was now ‘delivery’; but delivering the promised results needed more cash.

The stealth taxes may have served their turn at the outset; but their limitations were increasingly exposed. The most conspicuous trouble was the fuel-tax protest that swept across Britain (and much of Europe) in September 2000, when petrol stations ran dry as supply depots were barricaded by a motley alliance of farmers and lorry drivers, demonstrating against the rising cost of petrol. This was partly caused, of course, by the international oil price but partly too by successive increases in the ‘green’ levies, designed to discourage private motoring. The Government was temporarily hurt in the opinion polls – its only setback here – but faced down the increasingly incoherent protestors without conceding their demands. The incident paved the way for a more open debate, justifying taxation by the strategic objectives that it fulfilled. The fact is that, in its first term, the Blair Government did little to remedy a generation of underinvestment in the social infrastructure; it would need a second term to disclose a social democratic agenda that explicitly put fiscal resources behind plans to improve the educational system and the health service.

Conversely, the Government had no such inhibition in giving immediate priority to constitutional reforms that did not require large sums of money. Electoral reform was much discussed; but it proved, in more senses than one, to be a cheap promise, as the Liberal Democrats duly discovered. The commission on the voting system that the Liberal Democrat Roy Jenkins (Lord Jenkins) chaired in 1997–8 led nowhere, since its cautious proposals, built on the alternative vote, failed to enlist Blair’s own support, still less that of a majority in his party. The Government went no further down this road than introducing forms of proportional representation in elections for the European parliament and in the devolution measures for Scotland and Wales.

Devolution brought major changes in the constitution of the United Kingdom. The controversy over Irish Home Rule had, a century previously, become the major fracture line in British politics, redefining Conservatives as Unionists, much to their electoral advantage. In Scotland, indeed, the party was still called Unionist; its historic opposition to the creation of a Scottish parliament, however, was a major reason for its recent decline. In the 1997 general election the Conservatives won no seats in either Scotland or Wales, giving Labour a unique opportunity to settle the issue. It was all the more important for it to do so because it was haunted by memories of past failure: it was the stalling of Scottish devolution that provoked the fall of the Callaghan Government in 1979.

Timing was of the essence here. Learning from experience, the Government acted in the first rather than in the last year of a parliament. It exploited the degree of consensus in Scotland for a separate parliament to pass a measure devolving a wide range of powers, including some powers of taxation, subject to approval in a Scottish referendum, with the tax-raising powers requiring specific ratification in a second referendum question. The Scottish Nationalists were thus ready to join the campaign for a ‘yes, yes’ vote in the referendum, which produced a vote of 74.3 per cent for devolution itself and 63.5 per cent for the tax-raising powers in September 1997. These impressive majorities were in contrast to the narrow referendum result in Wales a week later, approving a scheme for a Welsh assembly, albeit with more limited powers. The institution of a proportional electoral system made coalition between Labour and the Liberal Democrats in a Scottish executive a likely outcome; this duly ensued; and it reflected their common interest in making devolution work. Though supported by the Nationalists as a first step towards their declared aim of independence, devolution could thus be seen as a new means of preserving the Union. Even the Conservatives found themselves co-opted into the new arrangements, since proportional representation served them well, given their weakened state in Scotland.

South of the border, the Conservatives were in a barely more robust state. In succession to Major, who quickly resigned from the leadership, Clarke proved unelectable in a shrunken parliamentary party now more Europhobe than Eurosceptic. The Conservatives accordingly chose William Hague, at thirty-six the youngest of the former cabinet ministers: an able parliamentary opponent for prime minister’s questions, deploying both a forthright Yorkshire manner and the rhetorical tricks of the Oxford Union. His youth consorted with a more informal image, reinforced by the presence of his vivacious new wife; and he made efforts to shift some of the cultural prejudices of a party membership now with an average age at the threshold of retirement. Hague showed himself, however, more ready than Major to take up traditional right-wing attitudes, with the ‘bogus’ status of many asylum-seekers becoming a stock taunt against the Government. Crucially, Hague strengthened the Conservative line against Europe, now saying that entry to the Euro should be ruled out for the duration of two parliaments.

Though the Euro remained unpopular, none of this lifted the Conservatives in the opinion polls, where they were lucky to register support above 30 per cent. Uniquely, this Government lost not a single by-election (though the Conservatives lost one to the Liberal Democrats). The assumption that Labour would be re-elected was widespread and well founded, as the results of the 2001 general election were to confirm. If the Conservatives were now perceived as right-wing, this image was more in tune with the inclinations of the party faithful than with the mood of the country – not perhaps the young country of Blair’s inspirational speeches but one which responded to his leadership with acquiescence.

Blair may not have been presidential, as was often alleged, but he never even paid lip service to the idea that he was simply the first among equals. The prime minister’s office was increased in size, becoming a separate department of the Government in all but name. Its explicit aim was to control both policy and its presentation, just as New Labour had been run from the centre, keeping everyone ‘on message’. This was largely the doing of Alastair Campbell, as press spokesman, who became the prime minister’s voice in the media, performing a vital role in a series of crises, especially that in Kosovo.

Blair and Brown established an axis of authority at the heart of the Government, their only rivals each other. Their relationship was initially muddied by the problem of finding a suitable role for Blair’s protégé Peter Mandelson. Now a newly elected MP himself, Mandelson’s ambition to be parachuted straight into the cabinet might, if fulfilled, have created embarrassment for his patron; but leaving him outside, in a curiously undefined role, also created embarrassment for Blair. There were press stories of ongoing internal feuds: between Mandelson and Brown, which was damaging enough, or between Blair and Brown, which was an appalling prospect for the New Labour project. As early as January 1998 Blair was forced into a fulsome affirmation of support for his Chancellor – ‘He is my Lloyd George’ – which inevitably evoked references to the fate of previous neighbours in Downing Street. Yet the partnership between Blair and Brown survived such speculation.

No other ministers were in this league. Certainly Mandelson himself failed to enter it, despite being given two chances to emerge from Blair’s entourage as a cabinet minister in his own right. This was a transition more successfully achieved by another Blair loyalist, Jack Straw, showing himself a minister who was capable of surviving at the Home Office with the robust approach long promised by New Labour. In opposition, Robin Cook, his roots in the unforgiving school of Scottish Labour politics, had shown himself a formidable parliamentary debater; but at the Foreign Office he stumbled, initially unsettled by a relentlessly publicized marital breakdown, and found it difficult to sustain the promises of an ethical foreign policy that he had rather freely distributed. Above all, Cook was increasingly eclipsed (like many predecessors) by a prime minister who hankered after a role as his own Foreign Secretary.

In the age of summit diplomacy, contact with the President of the USA was bound to be a priority for any prime minister. One of Thatcher’s last acts had been to encourage President Bush to take a strong line over the Iraqi invasion of Kuwait in the summer of 1990; but it then fell to Major to commit British troops to action in the consequent Gulf War at the beginning of 1991, as part of a coalition under United Nations auspices. The Gulf War had little of the political resonance of the Falklands episode. It proved to be a short conflict in which predominantly American high technology succeeded in releasing Kuwait, even though Saddam Hussein survived as Iraqi leader, free for the next decade to alternate cooperation and defiance over arms inspection by the United Nations.

This was the situation inherited by Blair. From the first he signalled strong support for President Clinton in forcing Saddam to back down: notably in February 1998 by securing a mandate from the House of Commons (opposed by a couple of dozen Labour backbenchers) for the use of force against Iraq. Six months later Blair supported Clinton’s use of cruise missiles in Sudan and Afghanistan, allegedly against terrorists, and Britain alone joined the USA in bombing Iraq at the end of 1998, invoking the need to uphold international law. Whether such moves subserved an ethical foreign policy, however, was disputable. Their timing more obviously subserved the attempts of Clinton to escape impeachment at the height of a sexually fraught scandal. At a time when he badly needed friends, he undoubtedly valued Blair’s support and was ready to reciprocate (especially over Ireland). If it became a stock charge against Blair – and not only on the left or among jealous European statesmen – that he was uncritically pro-American, he had helped foster this impression himself.

All the world’s a stage; and Blair trod it with developing confidence, both in his own rhetoric and in his own rectitude. Comparisons with Thatcher were sometimes made, but hers was essentially a nationalist perspective. Rather it was Gladstone, whose politics of conscience had been projected within an international dimension, who offered an exemplary model which, with initially uncertain steps, Blair showed signs of consciously emulating, whether in Ireland or in the Balkans.

As regards Northern Ireland, Blair was hardly the first prime minister in a quarter of a century of violent troubles to be shocked by the province’s descent into lawlessness. Under Thatcher, the British response had been resolute rather than imaginative, based on refusal to yield to terrorism, though resorting also to covert methods that were themselves illegal, as later became clear. The fact that the IRA had attempted her own assassination helped freeze attitudes in an embittered conflict which, as became apparent by the 1990s, neither side could win outright. It should come as no surprise that Blair, though tough on terrorism, determined also to be tough on the causes of terrorism. This meant recognizing that, although Unionists might sometimes be right in calling Nationalist opponents ‘lawless’, it was what the Catholic minority perceived as ‘injustice’ that fuelled support for the IRA. Perhaps Blair’s sensitivity on this point was sharpened by personal sympathies: his wife was a Liverpool Catholic, his children were sent to Catholic schools, and he often accompanied them to Catholic services.

Blair thus developed the more flexible approach pioneered by Major, who had first opened an informal channel of communication with the Nationalist forces, aided by a Government in Dublin that likewise sought some way out of this impasse. In December 1993 the two Governments had issued the Downing Street Declaration, worded with exquisite care so as not to offend historic sensibilities, but aimed at achieving a negotiated settlement. Such hopes were encouraged, eight months later, by an announcement from the IRA that it would suspend hostilities.

Not every dawn in Northern Ireland turned out to be false; but many did so in a peace process that was only to resume its faltering progress under Labour. The IRA ceasefire had meanwhile proved delusive, brutally ending with huge bomb blasts in London and Manchester in 1996. But Blair was determined on reviving a political alternative as prime minister, appointing a popular Northern Ireland Secretary, the extrovert and outspoken Mo Mowlam, whose unconventional initiatives helped to break the log-jam. An IRA ceasefire was restored; Dublin’s assistance was readily forthcoming; and American mediation also proved helpful, with Clinton’s intervention always a last card in Blair’s hand. By putting himself at the forefront of the quest for a settlement, like Gladstone and Lloyd George before him, Blair raised the stakes, not least rhetorically.

‘I feel the hand of history upon our shoulders,’ Blair said, on arriving in Belfast in April 1998. The crucial negotiations, at Castle Buildings, Stormont, now brought ancient enemies face to face, during tense days and nights, with success hanging in the balance until the last moment. David Trimble, an unusually adroit politician for a Unionist leader, found himself striking a bargain not only with the moderate Nationalists of the SDLP but with the IRA itself, now sanitized as Sinn Féin under the leadership of Gerry Adams and Martin McGuinness, who were revealed as two accomplished politicians, intent on living down their violent antecedents. As they put it themselves, ‘There can be no agreement that will work without Sinn Féin.’ This left open the question of whether an agreement would work with Sinn Féin.

The Good Friday Agreement in April 1998, finally signed by all significant parties, provided for a power-sharing administration in Northern Ireland. Thus the Nationalist minority (overwhelmingly Catholic) could achieve justice through the rule of law; or so Blair hoped. But, while capitalizing on Mowlam’s rapport with the Nationalists, he also realized that the Unionists needed reassuring that sovereignty could be abridged only by freely given consent. Agreement on the package was endorsed by 71 per cent of those who voted in a Northern Ireland referendum. Trimble, however, designated as first minister of a power-sharing executive, refused to share power until Unionists were assured that peace would be lasting.

The trade-off seemed simple: if terrorists would irrevocably lay down their arms, they would be accepted as legitimate participants in an equitably remodelled political system. There were terrorists in the Protestant community, to be sure, to whom this applied; but everyone knew that the real crux was the position of the IRA. For nearly thirty years the Provisional IRA had fought the British Government; to surrender its arms would be seen as an acknowledgement of defeat; the most that could be expected was ‘decommissioning’. Without tangible, verifiable signs of decommissioning, the Unionists were reluctant to accept the peace-loving credentials of former IRA activists, for all the ostensible commitment of Sinn Féin to political methods. In this context, each of the IRA’s incremental steps towards putting arms beyond use was successively greeted with Unionist dismay that the threat of violence had still not been removed by full decommissioning, as promised. Intermittent violence, notably the bomb at Omagh in 1998, planted by a splinter-group calling itself the ‘Real IRA’, gave credence to such fears. Only in November 1999 was a power-sharing executive at last in place, with devolved powers to follow. True, endemic difficulties recurred – including repeated suspension of the Assembly itself – with each crisis manifesting the failure of tentative IRA truce signals to match impatient Unionist expectations. The fact was, however, that the IRA did not resume its campaign, despite the maintenance of Partition; and that a political process meanwhile survived in Northern Ireland, with Sinn Féin’s adherence to it sealed by the political gains that it succeeded in extracting.

In Ireland, Blair showed himself both patient and persistent. Crucially, this was reinforced by a strategic commitment not always seen in other policies of his Government during its term of office. He was often a cautious prime minister, testing the water before committing himself. His greatest challenge remained that of defining Britain’s role in Europe. Here his instincts were to make the partnership work rather than to assert British peculiarity; but so long as the United Kingdom remained outside the single currency, there was inevitably some degree of marginalization. Opinion polls showed a majority unpersuaded about the Euro. As Chancellor, Brown established five economic tests that would have to be met before the Government would recommend its adoption. The first of these, that entry must be ‘good for Britain’, really embraced the rest; undeniably important, this set such a general criterion that it could hardly avoid a political as much as an economic judgement. The Blair Government thus deliberately left itself free in its second term to decide whether – and when – to call a referendum on the Euro.

Blair was a prime minister of natural caution, but one who could also show startling boldness once his beliefs were enlisted in a great cause. In office, he increasingly discerned such causes overseas, in a restless quest to put the world to rights. Above all, it was his moral populism that brought his own Christian convictions to bear upon secular problems with a passion that went beyond political calculation, even in such faraway places as Kosovo, Afghanistan and later Iraq, where the need to restore law and justice made an appeal to conscience.

Kosovo was the crisis that established Blair as an international leader. The affairs of this unhappy province of the former Yugoslavia, supposedly autonomous from the Serbian successor state, had tangled historic roots which fed luxuriant propaganda claims on all sides. The record of the western powers in protecting the mainly Muslim population from aggressive Serb nationalism since Yugoslavia’s disintegration had been, at best, weak; at worst, despicable. Blair had no real engagement with the issue until 1998; but, faced with the continuing evidence of Serbian ‘ethnic cleansing’, his increasingly judgemental view of the conflict was openly proclaimed. ‘This is now a battle of good and evil,’ he told readers of the Sun in April 1999. When he later went to see Kosovo for himself, ‘criminal’ was the way he privately described Serbian treatment of refugees.

Blair thus became the most resolute champion of effective intervention by Nato, whatever the risks and whatever the costs, to save the innocent people of Kosovo from lawless oppression. ‘We have in our power the means to help them secure justice,’ he claimed, ‘and we have a duty to see that justice is now done.’ His message was the need for a concerted international response; many of his key statements were made overseas, notably when he visited the USA. Throughout the crisis he worked closely with Clinton, just as he had in Northern Ireland; indeed, the two crises overlapped. The Nato air strikes against the Serbs, begun in March 1999, were mainly an American responsibility; but Clinton showed an understandable concern for domestic politics in his wariness over extending operations to the use of ground troops. Yet ultimately it was the professed threat of intervention by Nato on the ground that apparently produced the Serb capitulation after eleven weeks of bombing.

This was where Blair had forced the pace. Despite evidence of unease at this step in the British opinion polls, it was his decision to deploy British troops that helped persuade reluctant allies to contemplate doing likewise – and thus persuaded the Serbs to agree to Nato’s terms. Blair’s nerve had held despite many criticisms of the strategy itself and of the collateral damage it inflicted on non-combatants. What distinguished him was the sweeping way in which he defined the ends – the assertion of law and justice on an international scale – combined with a commitment to whatever means proved necessary. His tough-minded readiness to use armed force was backed by his own rhetoric in sustaining the necessary coalition in support. Blair’s leadership had risen to the challenge of Kosovo in a way that showed his own potential for shaping political outcomes. He had not been pushed into it by public opinion, nor was there direct electoral reward for his perceived success, achieved against the odds. After this, it missed the point to call Blair a puppet or poodle of the USA. Likewise, those who studied this crisis – the last of the twentieth century – ought not to have been surprised later by Blair’s response to the events of 11 September 2001 or his road to war in Afghanistan and Iraq.