‘What I really really want’
Churchill got his lucky number, but tomorrow there’s another.
Blur, ‘It Could Be You’ (1995)
The consorts, the lovers, the walkers, the alibis, the mistresses, the toyboys. Famous, or famous for being famous, or famous for fucking the famous, they were all there.
Terence Blacker, The Fame Hotel (1992)
Opinions are like arseholes – everyone’s got one.
T-shirt slogan (c. 1991)
Ever since the 1970s, Saturday evening television had been dominated by the family-friendly fare offered by BBC One, and in 1994 the channel secured the rights to broadcast what was hoped would be a major addition to its portfolio: the weekly draw of the National Lottery. In November Noel Edmonds presented the first programme to great national excitement and an audience of nearly twenty-two million viewers. Unfortunately, no one became a millionaire that first week – the jackpot was shared between seven winners – and the viewing figures soon began to slide, largely because it made for very mundane television.
There was another reason why fewer people were watching the programme. In 1998 ITV fought back with the quiz show Who Wants to Be a Millionaire?, hosted by Chris Tarrant, which became a huge success, reaching a peak audience of nineteen million, leaving the Lottery far behind. Its simple but effective format was exported to dozens of countries, far beyond the usual markets of Europe and the Commonwealth, so that by the early years of the new century it was said to be the most popular television programme right across the Arab world. Perhaps the Lottery show might have done better in the battle for ratings if it had followed the advice of Elizabeth Peacock, the kind of Conservative MP who could teach Tony Blair a thing or two about being tough on crime. ‘Flogging criminals live on television before or after the National Lottery draw will create a great impact,’ she argued. ‘The punishment should be done in public as a humiliation and to show others what will happen to them, and the National Lottery has a big audience to reach.’
Even without the added attraction of corporal punishment, the Lottery itself was instantly adopted as part of British life. Some two-thirds of adults bought a ticket in any given week, the average weekly spend being £2.05 per head of population, slightly less than half the amount spent on cigarettes. This was despite the absurdly poor odds: 14 million to 1 against scooping the jackpot, 54 to 1 against winning anything at all. ‘Someone told me a sobering fact,’ remarked a character in Peter Lovesey’s crime novel Upon a Dark Night. ‘No matter who you are, what kind of life you lead, it’s more likely you’ll drop dead by eight o’clock Saturday night than win the big one. So I don’t do it any more.’ But Kenneth Baker, the home secretary who had launched the Lottery white paper in 1992, had suggested that it was ‘a chance for the government to introduce a little gaiety into British life’, and that aspiration at least was realised.
There were those who objected to the creation of the Lottery. Some had a vested interest in other forms of gambling that were bound to suffer, particularly the pools companies; Littlewoods curtailed its donations to the Conservative Party in protest, while Vernons reported an instant fall of 15 per cent in revenue. Others were concerned that charitable giving would be similarly hurt, a charge airily dismissed by the government. However, early findings suggested that discretionary donations to charity fell in 1995; compensation came from the Lottery, so that the average household contribution remained unchanged from the previous year, but the additional layer of bureaucracy meant that actual spending was delayed. More significant was the existence of that bureaucracy, representing a transfer of power from individuals to the quango of the National Lottery Charities Board. This combined with a trend towards charities receiving cash directly from central and local government; Barnardo’s, for example, was by the middle of the decade reliant on the state for around 47 per cent of its income. By the end of the decade, the entire charitable sector had been transformed, almost beyond recognition.
There was a further concern that the state’s endorsement of gambling might prove detrimental to society. John Major, a keen enthusiast for the Lottery, argued that this complaint was misplaced and that a little flutter was a quite separate thing from serious betting; with such long odds, the game was ‘unlikely to attract the serious gambler’. That, however, somewhat missed the point. The number of people placing bets rose sharply as a result of the game’s introduction, and amongst the newcomers were some who became addicted to gambling, particularly after the launch of Lottery scratchcards provided an immediate thrill in the long days between draws.
Research published in 1998 showed that half of all children aged between twelve and fifteen had bought a scratchcard, despite being under the legal age to do so, and one report predicted that ‘within ten years gambling will be a bigger youth problem than drugs’. A suggestion by Camelot, the company running the Lottery, that it might have a draw every day was rejected by the regulator on the grounds that this would encourage excessive gambling, but some felt that the damage had already been done. By normalising the idea of betting, the Lottery had instilled in the nation a habit that would inevitably result in increasing numbers of addicts. Meanwhile spread betting was starting to become popular, and the first online casinos were being launched, promising a wild new frontier. By the end of the 1990s, spending on gambling had outstripped that on beer and wine, and nine-tenths of the population were placing bets, even if only occasionally, up from two-thirds at the start of the decade.
A less obvious consequence was remarked upon by Mervyn King, the chief economist of the Bank of England (and later its governor), who claimed in 1995 that the Lottery had ‘taken money out of the economy and is one of the reasons for the lack of a “feel-good” factor’. Such a belief was, of necessity, impossible to verify, let alone quantify, but an economist with HSBC Greenwell estimated that ‘between £250 million and £750 million worth of lottery spending would appear to have been diverted from retail spending’ during the game’s first full year of operation.
The defence of the Lottery, its justification as a social enterprise, was the money that it raised for what were always called ‘good causes’. When the idea of a lottery had first been floated in the late 1970s, by a royal commission on gambling chaired by Victor Rothschild, it was in response to a financial crisis that was hitting arts spending hard, and the suggestion was that this was the easiest way of raising money for endeavours that required subsidy. By the time the commission’s report was published, however, the government of James Callaghan was in its dying days, and its successor was disinclined to follow up the recommendations. Margaret Thatcher did look at the idea of an NHS lottery but rejected it with Methodist distaste: ‘I did not think that the government should encourage more gambling.’
With the arrival in office of John Major, the report was dusted off and swiftly put into practice, though the originally proposed recipients of the proceeds were expanded from the arts, heritage and sport to include charities and, an element that no one fully comprehended, a so-called Millennium Fund. These would receive 28 per cent of the money collected by Camelot, shared equally between the five areas of interest.
Since most of this money was to be distributed through various quangos, including the Arts Councils and the National Heritage Memorial Fund (the latter chaired by Jacob Rothschild, son of Victor), it didn’t take long to work out that there was a curious transfer of wealth in play here. The guaranteed winners were always going to be the government, which taxed the sale of tickets, and Camelot itself, as the singer Eddi Reader cheekily made clear when she was invited to start the draw: ‘It could be you. It’s definitely Camelot!’ But given the way the ‘good causes’ were structured, the other winners would now include those cultural institutions much favoured by the great and the good.
‘In public spending,’ explained David Mellor, when taking the legislation through Parliament, ‘one cannot expect the restoration of the Royal Opera House or the construction of a new opera house in Cardiff to take priority over the legitimate demands of the health service, and that is why the lottery was created.’ He also admitted that charities had only been included in the list of beneficiaries at a late stage, in an attempt to forestall claims that charitable giving would be hit by the Lottery. No one suggested that the people who actually played the Lottery might be given a say in how the proceeds were to be distributed.
If the recipients of the Lottery’s largesse were most likely to be those with sufficiently rarefied tastes in music, art and dance to appreciate the finer things in life, the contributions were disproportionately made by the working class, both in terms of actual amounts spent and, even more markedly, as a share of household income. A greater percentage of those in socio-economic classes C2, D and E bought tickets than those in classes A and B, and they bought more of them. As Julian Critchley summed up the arrangement: ‘The working class and the underclass are encouraged to spend in pursuit of unimaginable riches. The money raised is then spent on a series of middle-class good causes.’
This process didn’t always run smoothly. The first big purchase from the heritage monies was the acquisition for the nation of Winston Churchill’s papers in April 1995. It was supposed to be a populist gesture, evoking the memory of the great wartime leader in the run-up to the VE Day anniversary, but when it was discovered that the Churchill family was being paid £13 million out of the pot for ‘good causes’, there was a public outcry. It didn’t help that one of those cashing in was Churchill’s grandson, himself a Tory MP. NEVER HAS SO MUCH BEEN PAID BY SO MANY TO SO FEW was the Daily Mirror’s scathing headline, while the Independent called it a ‘startling redistribution of wealth from ordinary working people to leading Conservatives’. Following on from the £55 million donated to the Royal Opera House – where a ticket to see Luciano Pavarotti that year cost £267 – the publicity was appalling. ‘The lottery is our only hope. The working classes need hope,’ a Glaswegian woman told Nick Danziger, but she resented the consequences of her weekly flutter. ‘I’m not spending me money on lottery when they’re giving £58 million to opera, to the fucking toffs. I’ll pay for half a ticket, and I begrudge even that.’
Thereafter a major effort was made to improve the presentational side of the Lottery’s awards, with a much greater focus on promoting small community projects, even if, in reality, these received only a small proportion of the sums being made available. Further rehabilitation came with the incoming Labour government’s pledge to redirect some of the grants towards health and education, which was broadly welcomed by the public, though it was sometimes seen as merely a sticking plaster. Hope Park, the school in the television drama series Hope and Glory, had a beautifully equipped music room, thanks to Lottery funding, but no music teacher.
A major factor in the Lottery’s continuing success was the enthusiasm with which the tabloid press greeted big winners who agreed to have their names made public. This ready supply of good-luck stories helped to plug a gap that had been growing ever since the arrival on British news-stands of Hello! magazine in 1988 had outflanked the established print media, by offering uncritical attention and vast sums of money to those celebrities featured in exchange for exclusive coverage. Such offers were not restricted to international stars; in 1999 it was reported that James Major, son of the former prime minister, had sold to Hello! the photographic rights of his wedding to the model Emma Noble for a sum of £400,000. Unable to match such largesse, the tabloids struggled along with celebrities of ever greater triviality: footballers, soap actors, members of boy bands and girl groups, and pretty much anyone who’d been on British television yet wasn’t big enough to command a serious fee.
The diminishing returns of this culture were illustrated in 1997 when ITV’s show The Cook Report decided to investigate corruption in the pop charts, a practice so long established that as far back as 1964 Brian Epstein, the manager of the Beatles, had felt obliged to deny rumours that he’d hyped the group’s first single into the top twenty. To illustrate the sharp practices involved, The Cook Report set out to make a star of one of its researchers, 22-year-old Debbie Currie, whose mother Edwina was then fighting a doomed campaign to save her South Derbyshire seat. Debbie was given a remake of ‘You Can Do Magic’ (a 1973 hit for Limmie and the Family Cookin’), with vocals donated by erstwhile pop star Sinitta and production by Mike Stock of the Stock, Aitken and Waterman hit factory, and sent out to sell the record.
As an exercise in investigative journalism it looked a bit excessive – ‘the costs were huge,’ admitted the programme’s producer, David Warren – and was entirely undermined by the fact that awareness of the operation circulated in the music industry from an early stage. As an illustration of the media’s desperate desire for off-the-peg celebrities, however, it could hardly have been bettered. Currie was great copy, a Lycra-clad flirt ‘boasting of four-in-a-bed romps’ and proclaiming her intention of being ‘Sexy Spice’. In the days before the single’s release, she notched up forty-seven press interviews, twenty-six radio interviews and fifteen television appearances. She was also seen doing spots in sixty-two nightclubs, five festivals and one football match. The record may have sold only 400 copies and stalled at number eighty-six in the charts, but briefly Debbie Currie was indeed a media star.
Almost as unsuccessful was the attempt to turn the supermodel Naomi Campbell into a pop star. In 1994 a novel titled Swan was published under her name, though it was doubtful how much she had contributed to the enterprise. (In Drop the Dead Donkey, Sally is horrified to find bad language in her own novel: ‘I knew I should have read it before it went to the publishers.’) To accompany the book came a single, ‘Love and Tears’, which was promoted on Top of the Pops but still failed to get any further than number forty, while its accompanying album Babywoman (1994) sold just 240 copies in its first week. It seemed that there was a limit to the British public’s appetite for shoddy goods, however celebrated the purveyor.
To promote the themed book and single, Campbell took to wearing a dress intended to evoke the natural beauty of a swan. And at a press conference to launch the products, the comedian Paul Kaye, in his guise as Dennis Pennis, put to her one of his finest lines: ‘Hey Naomi, how come you’re dressed like a duck?’
Kaye’s character was designed as an irritant. ‘In a country which is largely governed by the celebrity party, I am the voice of opposition,’ Pennis explained on his television show, as he went about cheeking anyone he could find at publicity launches and premieres, from Joan Collins at the Venice Film Festival (‘You look like a million lire’) to Michael Howard (‘If you reintroduce capital punishment, will that just affect London?’). He was not the only one purveying this kind of prank television, adapting the format of Candid Camera and Beadle’s About to trick public figures into making fools of themselves. Other comics providing variations on this comedy of dramatic irony included Mark Thomas, Mrs Merton (initially, at least) and, towards the end of the decade, Sacha Baron Cohen, whose characters Ali G, Brüno and Borat all went on to star in their own movies.
The practice began, as did so much else, with The Day Today, where Chris Morris would interview, for example, the sometimes pompous Labour MP Paul Boateng and induce him to comment on a fictitious rap star: ‘Herman the Tosser is not someone who has invaded my own particular consciousness.’ In his own series, Brass Eye (1997), Morris took this to extraordinary lengths, persuading dozens of minor celebrities to support made-up campaigns to protect a Sri Lankan village from the effects of heavy electricity, to save an elephant in a German zoo which has its trunk stuck in its rectum, to thunder against a musical based on the Yorkshire Ripper, and to lead a fight against a new drug named ‘cake’. On this latter subject, Bernard Manning had the best lines to read (‘One young kiddie on cake cried all the water out of his body. Just imagine how his mother felt. It’s a fucking disgrace’), while the MP David Amess was prevailed upon to ask a question in the Commons about the fictitious drug.
If ultimately all that the tactic revealed was the willingness of the mildly famous to do absolutely anything that might keep them in the public eye, it did along the way result in some wonderfully absurd imagery, such as the former TV-am weather girl Tania Bryer delivering a nonsensical voice-over as though she understood what she was saying: ‘This is footage of the river Euphrates flowing backwards. It looks like it’s flowing forwards but only because we’ve reversed the film. Heavy water deeply confuses river flow systems. Just last month two rivers got completely lost and were found wandering uselessly about the southern oceans.’
The main point of Brass Eye, though, was to satirise and deconstruct the clichéd conventions of current affairs shows in the same way that The Day Today had tackled television news. A black studio guest would be captioned as ‘Representing every single black person in Britain’, and Morris would turn portentously to the camera after a film report to announce: ‘The situation is clearly grave enough to merit a black-and-white freeze frame.’
Much of the best television comedy of the decade was similarly rooted in playing with the medium itself, from Drop the Dead Donkey and Stephen Fry’s investigative journalist in This Is David Lander, through Vic Reeves and Bob Mortimer’s triumphant parody of a game show in Shooting Stars, to the obscure targets of some of The Fast Show’s best-known sketches: Country Matters, Jazz Club and Channel 9. Television was also at the heart of much of the era’s post-alternative stand-up. ‘I tell you what my comedy is,’ observed Eddie Izzard. ‘It’s pop culture stand-up. It’s all television. All the stuff I talk about, history, whatever, it’s from the History Channel, the Discovery Channel. It’s all from television.’ For a generation that had never known a home without a television set, had grown up watching the world through a screen in the corner of the room, the media had become the unifying culture.
Much of this comedy was essentially a warm embrace of television, particularly that of the recent past. Brass Eye, on the other hand, displayed a rage against the medium that could be genuinely disturbing. The programme’s peak came with the 2001 special ‘Paedogeddon’, which confronted the greatest of all tabloid subjects, and was broadcast at a particularly sensitive time.
In the summer of 2000 an eight-year-old Sussex girl named Sarah Payne had been abducted and murdered and, in response, the News of the World, under its editor Rebekah Wade, began a campaign to allow public access to the Sex Offenders Register, so that the identity and whereabouts of paedophiles released from jail could be known. The paper published an initial batch of the names and locations of forty-nine such offenders and announced that it would continue doing so until it had got through the entire list. This was clearly an implausible undertaking – there were 110,000 convicted paedophiles in the country and, at the rate the News of the World was going, it would take over forty years to identify them all – but although it was soon abandoned, in the interim the disclosure did considerable damage.
A wave of vigilante attacks took place, particularly on the Paulsgrove estate in Portsmouth, and several people who happened to share names with those in the paper were attacked. Private Eye ran a cartoon of a man running from a mob, protesting that he was a paediatrician not a paedophile, but fiction was becoming increasingly difficult to separate from fact; the following week a paediatrician in South Wales woke to find that the word ‘paedo’ had been painted on the front of her house.
With such hysteria fresh in the memory, it wasn’t surprising that Brass Eye’s treatment of the subject attracted enormous hostility. Morris parodied pop music’s longstanding interest in underage sex with a 1970s glitter pop band singing ‘Playground Bang-a-Round’ and a white rapper, JLB-8, singing ‘Little White Butt’ (based on the old Tommy Steele song ‘Little White Bull’). He mocked campaigners whose definitions of abuse stretched beyond the credible (‘Today the number of children having sex with adults is beyond belief; if you define a child as anyone under thirty, the figure is over 86 per cent’). And he pinned the prurience of the media, introducing a case study with the words: ‘We believe his story is actually too upsetting to transmit. We only do so tonight with that proviso.’ He ended with a reassuring word of advice to the viewer: ‘Look, if a child does take your fancy, please remember: leave it a couple of years.’
In response to the broadcast, Channel 4 received bomb threats, Scotland Yard said they were considering an investigation on grounds of obscenity, and politicians queued up to voice their thoughts. David Blunkett, now the home secretary, was on holiday and unfortunately missed the show but, having been told about it, decided that he ‘did not find it remotely funny’; Home Office minister Beverley Hughes said it was ‘unspeakably sick’, though it turned out she hadn’t seen it either; and culture secretary Tessa Jowell suggested that regulations might be changed to prevent such programmes being repeated.
Press reaction was confused. The Daily Mail said it was ‘the sickest TV show ever’, but still found room in its pages for photographs of the Duchess of York’s children, Beatrice (aged thirteen) and Eugenie (eleven), wearing bikinis, while the Star put one of its denunciations next to a photograph of fifteen-year-old Charlotte Church, and expressed its admiration of her figure as ‘looking chest swell’. Even those who might normally defend Morris were in condemnatory mood. It was ‘a deeply unpleasant piece of television that degraded children’, editorialised the Guardian. And in the midst of the controversy, the public figures duped into taking part were understandably furious, including the MPs Gerald Howarth and Syd Rapson, as well as Phil Collins, Gary Lineker and the radio disc jockey Dr Fox, the latter having proved eager to tell us: ‘Genetically paedophiles have more genes in common with crabs than they do with you and me. Now that is scientific fact. There’s no real evidence for it but it is scientific fact.’
Nearly a decade on from The Day Today, that episode of Brass Eye had taken provocation as far as anyone would be permitted in the foreseeable future. It represented the culmination of a style of comedy that had nowhere else left to go. The assault on the media’s acquiescence in the cult of celebrity had been relentless, but had proved difficult to sustain. Indeed the blurring of the lines between current affairs and celebrity trivia had arguably been exposed most convincingly not by Chris Morris but by ITN’s main news programme, News at Ten, which during the 1998 football World Cup, had opened a bulletin with the words: ‘The main news tonight is David Beckham’s apology for being sent off against Argentina last night.’ It was hard to know quite what satire could add to such a statement.
The need for cut-price celebrities remained, but in the wake of the television pranksters, such figures were now required also to develop a reputation as good sports or for having no sense of dignity. When Neil Hamilton was removed from the House of Commons by his constituents in 1997, he and his wife, Christine, promptly appeared as a double-act on Have I Got News for You – where they were handed their fee by Angus Deayton in a brown envelope – and embarked upon a media career that relied entirely on their lack of any discernible sense of shame. Every media appearance seemed to involve the heaping of humiliation upon them, and they cheerfully went along with the game, giving every impression of being a ‘macabre pair of attention-seeking mutants’, in the words of Andy Hamilton.
There were still plenty of news stories to be wrung from celebrities, but newspapers also required a more reliable source of human interest tales. Which is where the Lottery proved so useful, by suggesting that ordinary folk who had got lucky might be of interest to readers.
There was a precedent for such stories. The acquisition of sudden wealth by members of the working class had long been the stuff of fiction, from Gillian Freeman’s 1959 novel Jack Would Be a Gentleman to the 1960s American sitcom The Beverly Hillbillies. Real life had added the Yorkshire housewife Viv Nicholson, who won the football pools in 1961 and announced that she intended to ‘spend, spend, spend’, before doing just that and ending up penniless. Now the Lottery promised such human dramas on a regular, possibly even a weekly, basis. In this context, it was no great surprise that a musical based on Nicholson’s life – titled Spend Spend Spend and starring Barbara Dickson – proved a hit with critics and audiences alike when it opened in 1999. (The theme of a working-class couple winning the Lottery itself was the subject of John Godber’s play Lucky Sods, which opened in 1995.)
Around the same time as the Lottery came a quieter development whose impact took longer to become evident. The series Sylvania Waters, a co-production by the Australian Broadcasting Corporation and the BBC, debuted on British television in 1993, and depicted the lives of a newly rich couple in the eponymous upmarket suburb of Sydney. Though little more than a cult success, it did suggest that the fly-on-the-wall documentary – which had hitherto been obliged to justify its existence by claiming sociological worthiness – might now be permitted purely as voyeuristic entertainment. A host of other shows followed in the middle of the decade, including most notably Airport (1996), The Cruise (1998) and Driving School (1997). Each of the three brought tabloid fame to one of its participants. Aeroflot employee Jeremy Spake went on to become a television presenter; cruise singer Jane McDonald achieved a number one album and a presenting career; and inept learner driver Maureen Rees bizarrely managed to have a minor hit single, with a cover of Madness’s ‘Driving in My Car’. By now such shows had been rebranded as docusoaps, an awkward coinage that nonetheless gave an accurate description of their content.
Alongside them came the genre of makeover shows. Interior decor had become a fashionable subject for magazines in the 1980s, with Interiors (later retitled World of Interiors) followed by the likes of Country Homes & Interiors and Elle Decoration, but only when the interest was applied to more humble homes did programme-makers consider it suitable for television. Changing Rooms (1996) was the most durable format, making stars of designer Lawrence Llewelyn-Bowen and carpenter ‘Handy’ Andy Kane as it brought together two couples to redecorate a room in each other’s houses. The appeal was partly that of seeing into other people’s homes, and partly the joy of the occasional catastrophe, when someone would express their distress about what had been done to their living room. The formula was then applied externally with Ground Force (1997), a series based on the improbable notion that a garden could be created in two days, and which introduced television viewers to the braless charms of Charlie Dimmock.
Somewhere in the mix too was the oddity that was Stars in Their Eyes, in which members of the public were given the right to appear on television, but only by imitating existing stars. Essentially an evening of tribute bands crossed with a grand karaoke competition, aided by professional make-up and costumes, it became for some a passport to a career. When the group Hot Chocolate needed a replacement for their singer Errol Brown, who had left for a solo career, they recruited Greg Bannis from the show. ‘We got him from Stars in Their Eyes,’ said guitarist Harvey Hinsley. ‘He was copying Errol, doing “You Sexy Thing”, and he was exactly like Errol.’ Perhaps too there was a connection to be made with the rise of the concept that stardom could be taught, with the opening of the Brit School in Croydon – sponsored by the British Record Industry Trust – and then the Liverpool Institute for Performing Arts. The impact of such academies was to become evident in the charts and on television in the following decade, producing artists as diverse as Amy Winehouse, the Kooks and Leona Lewis, as well as a substantial number of cast members in various television soaps.
In a separate, but related, development came the popularisation of the camcorder. The television series You’ve Been Framed (1990) encouraged viewers to send in humorous home videos, and was followed by the likes of Neighbours from Hell (1997), which itself spun off into other themed shows, including weddings and holidays. The BBC, taking a more high-minded approach, launched Video Diaries (1990), in which people were given a camera to record themselves talking about their everyday lives. The format was adopted in other areas, including an advertising campaign for Superdrug in 1996, and a programme in which Amanda Platell documented the Conservatives’ 2001 election campaign, displaying little apparent interest in loyalty to the party that employed her. Meanwhile comedians Steve Punt and Hugh Dennis gave us the video diary of Samuel Pepys from 1665. More significant was the trend to include amateur footage in news programmes, a process mocked as ‘genutainment’ by The Day Today. ‘Real events shot by chance on amateur cameras are increasingly putting real news crews out of business,’ announced Chris Morris, before introducing a segment titled ‘It’s Your Blood’. By the end of the decade, television channels were actively seeking such material.
All these strands – the docusoap, the makeover show, the video diary – were rooted in the idea that the lives of ordinary people could be made interesting for the public, a belief that led inexorably to reality television. The arrival in the summer of 2000 of Big Brother, a format developed in Holland, saw ten previously unknown people – all but one in their twenties – locked up in a house together for sixty-four days and nights, while viewers progressively voted for their eviction, one a week. When just three contestants were left, viewers voted for their favourite, who received a prize of £70,000. They were given tasks to perform, but this was not a game show, more a popularity contest, combined with twenty-four-hour video surveillance. It turned into Channel 4’s greatest ratings winner, with a final-night audience of ten million and nearly eight million votes cast, although by that point the best-known contestant, ‘Nasty’ Nick Bateman, had already been evicted for cheating and lying. The story was covered even in the Financial Times.
‘It is a shocking truth,’ wrote Charles Kennedy; ‘more people have voted in recent Big Brother polls than voted in the European elections.’ That wasn’t quite true, since it assumed that every vote cast over the course of the show’s nine-week run came from a different person, and the final night’s tally was still two million short of the turnout for the 1999 European Parliament election, but something was stirring here, and it demanded attention. As voting in elections declined, the opportunity to be consulted in the field of light entertainment was growing enormously. There was some anguish over what this might mean, particularly when Big Brother spawned dozens of sequels and imitators over the next few years.
These reality shows, as they became known, were joined in 2001 by Popstars and Pop Idol, essentially old-fashioned talent contests which brought at least temporary fame to the group Hear’Say in the former, and to Will Young and Gareth Gates in the latter. Again huge voting figures were recorded, as the public revelled in its chance to choose its own stars even before their careers had started. There was nothing new about the concept, which relied on telephone voting, just as Bob Monkhouse’s Bob Says Opportunity Knocks had in the 1980s, except for the noise and the hype attached to the new shows. New Faces and Opportunity Knocks had never been able to command newspaper front pages. Nor had the contestants’ life stories mattered in the same way, for now talent was no longer the sole criterion on which they were judged; to be successful on such a show required also a tale of overcoming obstacles and hardship in pursuit of a dream.
And meanwhile there was the steady, inexorable rise of the internet, with its promise of ever greater democracy in culture. ‘Whether or not we want it,’ wrote the journalist and musician Emer Brizzolara in 1995, ‘we are going to have access to the words and music and art of Joe and Janet Average.’ These were still early days, when CompuServe, then the leading internet service provider in Britain, was pleased to be registering 1,500 new customers a week, but already the new technology’s potential use as a marketing tool was becoming clear. The American record company Capitol had shown the way forward when it set up a site to promote the album Youthanasia by Megadeth a month before its release in November 1994. The site attracted a million hits and enabled the album to enter the US charts in the top five, an unusual achievement for a heavy metal band, since the opportunities for exposure in the conventional media were so limited.
Many expressed scepticism about how important this new medium really was, some of them recalling the words of the great American philosopher Henry David Thoreau in the 1850s. ‘We are in great haste to construct a magnetic telegraph from Maine to Texas,’ he wrote; ‘but Maine and Texas, it may be, have nothing important to communicate.’ Others, however, were enthusiastic from an early stage. In particular, minorities who felt excluded from mainstream broadcasting found that the internet offered a means of direct communication.
In 1998 the former professional wrestler Jesse ‘The Body’ Ventura was elected as the governor of Minnesota for the Reform Party, using the net as a major campaigning tool. His victory was not matched by anything comparable in Britain, but there were opportunities to bypass the conventional media. ‘The BNP’s cyberspace audience is now far larger than the readership of all our printed publications combined,’ excitedly reported the British National Party in 1999. ‘In terms of reach and impact, our internet operation has already far exceeded the highest hopes with which it was launched three years ago. And the best is undoubtedly yet to come.’ By that stage, eight million people in Britain had access to the internet, more than in any other European country.
Perhaps the first British politician to become an advocate for the medium was John Redwood. In April 1995, while still Welsh secretary, he announced plans to connect all primary schools in Wales to the net, recognising that this was the future: ‘Our children have no doubt. They are dancing to the tune of cyberspace enthusiastically.’ A few months later, Tony Blair unveiled a proposal to do the same for schools throughout the United Kingdom, should Labour be elected. All parties became increasingly keen to lay claim to what was then routinely described as the information superhighway, though – just as the Coronation in 1953 and the marriage of Charles and Diana in 1981 had given huge boosts to television and video recorders respectively – the biggest single spur was again a royal story. The BBC had covered the 1997 election online, but it was with Diana’s death that the corporation really became aware of the possibilities; within weeks it had announced that it was setting up a full online news service, which rapidly became the most visited British site.
Part of the appeal for politicians was the dynamic that the internet seemed to offer for the economy, opening a space for entrepreneurs and small businesses to establish themselves, at a time when the tendency was towards ever larger companies. Once, the phrase ‘working online’ had referred to the production line of a factory; now, in a period of prosperity and affluence, internet start-ups were the new pioneers of capitalism, attracting absurd valuations and inflated share prices on the world’s stock markets in what Private Eye derided as SouthSeaBubble.com. It was unsustainable and, in March 2000, the bubble duly burst. The internet itself, however, continued to grow, its full impact as yet unknown in the fields of either politics or commerce. It was uncertain, for example, whether Britain would be attracted to the concept of buying goods off a screen; it had taken the American television shopping channel QVC five years to turn a profit after its 1993 launch in the UK, and the short-term prospects of internet shopping were far from rosy. Amazon, one of the genuine successes, was sufficiently realistic that when it launched in 1995 (its British branch followed in 1998), it made clear its expectation that no profit would be recorded for several years.
Instead, as Brizzolara predicted, the real appeal of the net in those earliest days was the chance it gave everyone to have a voice, whether to promote their own work or simply to express their opinions. The latter function had previously been filled primarily by radio phone-in shows, which had started in 1968 with the first BBC local radio stations and had since grown relentlessly, taking a firm hold in the mid-1990s. BBC Radio 5, launched in 1990 as an awkwardly hybrid speech station, offering children’s and educational programmes as well as news and sport, was reformatted in 1994 as Radio 5 Live, with a much stronger phone-in element to its daytime magazine shows.
The following year Talk Radio UK, the first commercial national speech station, was launched, promising to bring the American tradition of controversial shock-jock broadcasting to Britain, though the fact that its presenters included Dale Winton and Simon Bates suggested that it wouldn’t be quite as confrontational as that implied. For a couple of years it specialised in phone-in shows on current news topics hosted by the likes of Terry Christian, David Starkey, Peter Hitchens and Chris ‘Caesar the Geezer’ Rogers, before being bought up and rebranded as Talksport in 2000. Meanwhile Radio 4’s Any Answers, which had previously relied on letters written by listeners in response to what they had heard on Any Questions, was changed to become a phone-in show.
‘Opinion-making is this country’s most virulent growth industry,’ the playwright John Osborne raged in 1993. ‘Phone-ins proliferate, choked with calls from the semi-literate, bigoted and barmy.’ Osborne died the following year, but it’s probably safe to assume that he wouldn’t have much approved of the internet, which at times resembled a giant phone-in show, without the selection process or the mediation of a host. Frequently ill-informed, inaccurate and intemperate, it nonetheless had a vitality and energy that many found irresistible.
And it was in tune with trends in the media, for in the 1990s the balance seemed to shift decisively towards the first half of C.P. Scott’s famous formulation ‘comment is free, but facts are sacred’. As novelist Christopher Brookmyre put it: ‘The success of popular reporting since the eighties had lain in the practice of massively increasing the ratio of column inches to facts. Facts were both expensive and time-consuming to procure, so you had to use them as sparingly as possible.’
There had long been columnists whose stock-in-trade was commentary on events rather than straight reportage, but the slow withering of the print media was accompanied by a rise in the perceived value of star writers. Considered capable of building loyal fan bases, the likes of Richard Littlejohn, Suzanne Moore, Garry Bushell and Tony Parsons were transferred between titles in the manner of professional footballers. Bushell and Parsons had come from the opinionated world of the late-1970s music press, as had Julie Burchill, Parsons’s erstwhile wife, who moved, in her own account, ‘from enfant terrible to grande dame, with nothing in between’. Her writing epitomised the best and worst of the genre, capable of entertaining and irritating at the same time, and couched in a flashy, gimmicky style that frequently spilled over into self-parody. There was also a tendency to play fast and loose with evidence when an argument needed bolstering, with mixed results. Burchill’s denunciation of John Lennon – ‘the tosser was at art school in the early ’50s’ – didn’t quite tally with the fact that the future Beatle didn’t leave secondary school until 1957, and her suggestion that George Orwell, who died in 1950, was ‘working for the CIA all through the ’50s’ was no more convincing.
The internet was not noticeably more conscientious when it came to checking facts. On the other hand, it had no obligation to be so, and no editors and sub-editors to keep it on the straight and narrow. ‘The internet is a subversive, anarchic, individualistic arena,’ wrote Ed Vaizey, another future Conservative MP who worked in public relations and journalism before becoming a political adviser. ‘It is a fundamentally Tory medium, promoting freedom, individual choice, and reducing the role of state bureaucracy to a minimum.’
That inherent conflict between the net and the state would come to be one of the world’s great political issues in the new century, and – as might have been predicted – in Britain it was to start with scares over pornography. ‘The internet is effectively the end of censorship,’ enthused one entrepreneur in 1998. ‘It is impossible to police, because there are millions of sex sites, with probably new ones starting up every few hours.’ To which there came the inevitable calls for control. ‘We must find a way to regulate it,’ demanded Conservative MP Andrew Rowe, warning about the dangers of ‘international arms or drug dealers, or those peddling pornography and encouraging paedophile rings’.
When, in 1995, Elizabeth Coldwell had written in Forum about the impact of computer technologies on the sex industry, she had focused on CD-ROMs and had concluded that, by comparison, ‘the internet is exciting, but it’s not particularly sexy’. Just three years later, however, it was being estimated that around three-quarters of all internet searches were in pursuit of pornography. Since most of the sites were located abroad, there was little that British legislators could do, though a start was made with the Criminal Justice Act of 1994 which extended the offence of possessing indecent photographs of children to computer images, including those held on a hard drive after accessing them on the internet. A decade later, this was further extended to include the possession of ‘extreme pornographic images’ which were ‘grossly offensive, disgusting or otherwise of an obscene character’.
Meanwhile, since America was leading the way in cyber-porn, there was a rise in American imagery and a shift in international taste. At the beginning of the 1990s the existence of a British magazine called Shaven Ravers indicated the daring novelty of pubic shaving; by the end of the decade, the practice had become the norm in pornography and increasingly commonplace among the general public.
There was, however, a British reaction to the shaved, pneumatic imports. This built on what was known as the white-panty subgenre, a trend towards amateur models evident in magazines such as New Talent and Amateur Video – the latter came with a videotape cover-mount – which featured the middle-aged (‘old and bold’) as well as the ‘young and fresh’. A spate of magazines such as 40 Plus similarly focused on older models, while MIA Video gave us the character Ben Dover (‘the randy-cam reporter’), who helmed a series of pornographic videos with amateurs.
The internet was the obvious home for such material, the perfect vehicle for specialist tastes. It also allowed the spread of information about the practice of dogging – meeting in laybys for sexual encounters with strangers – which became a heterosexual equivalent to the time-honoured gay customs of cruising and cottaging. It turned out that there were a great many Joes and Janets Average who wished to share more than their ‘words and music and art’.
The essence of the internet was its sense of democracy and, in retrospect, the fact that Nick Bateman was the first reality TV villain came to look ever more appropriate to this new world. He was the oldest contestant in the first series of Big Brother, and equally distant from the others in terms of class background. An ex-public schoolboy who had attended Gordonstoun at the same time as Prince Edward, he had gone on to work as an insurance claims broker (he put his deceit on the programme down to his time working in the City, where ‘people will stitch you up to further their own careers’). He was, in short, posh, and was referred to in the newspapers as ‘posh Nick’ before the appellation ‘Nasty Nick’ took root.
The use of the word was revealing. At the start of the decade ‘posh’ was seen as a slightly dated term, but in the 1990s it enjoyed a huge revival. Some of its popularity was clearly attributable to the existence of Victoria ‘Posh Spice’ Adams, but even discounting references to her, there was a nearly threefold increase in the word’s usage in newspapers like the Guardian and The Times during the 1990s, as compared to the previous decade. In the tabloids, where anti-elitism and monosyllabic brevity were of the essence, it was more extensively employed still. And the definition of what was actually deemed to be posh widened considerably. The fact that Adams herself was given the nickname, when she came from a comfortable middle-class family and attended a state secondary school, indicated that it was no longer to be restricted to the highest echelons of society.
None of this represented a sudden change. Much of Britain’s post-war history had been the story of crumbling privilege, from criticism of the establishment in the 1950s, through the rise of a pop culture aristocracy in the 1960s and the political power of the trade unions in the 1970s, to Margaret Thatcher’s assault on the professions in the 1980s. The result was supposed to be the classless society promised by John Major and enthusiastically endorsed by Tony Blair and New Labour, a new meritocracy in which one’s origins and family background would count for nothing. Whether that had been achieved was highly questionable when it came to consideration of real power, privilege and money, but certainly there was a cultural embrace of the concept, and the accusation of being posh came to be a damaging criticism, at least in the world of tabloid editors and commissioners of television programmes. Stories began to appear of a reverse discrimination: Ben Fogle, who became a star on an early reality TV show, Castaway 2000, was said to have been earlier turned down as a children’s television presenter because ‘his accent was too posh’.
The taste now was for the everyday, a celebration of normality in a democratisation of culture that was one of the more pronounced features of the decade and one of its key legacies. There seemed something entirely appropriate about the fact that one of Margaret Thatcher’s last acts, before handing over the premiership to John Major, was to approve the appointment as the new Archbishop of Canterbury of George Carey, a man who had failed the eleven-plus and had left school at the age of fifteen. While both Major and Carey remained unusual figures as outsiders in senior establishment positions, they did represent, in the modesty of their origins, a culmination of a process that had begun some decades earlier.
At a less elevated level there was the example of Chris Evans, the most talked-about broadcaster of the decade, who moved from playing records on the London radio station, GLR, to presenting The Big Breakfast and Don’t Forget Your Toothbrush on Channel 4, until – at the height of Cool Britannia – he was hosting both TFI Friday on television and the Radio 1 breakfast show. His success was based on his sheer ordinariness, his immersion in mass culture, though with fame came complaints that he had lost contact with normality; he was ‘the media’s playground bully’, in Tony Parsons’s words, or, as Luke Haines put it, ‘a shallow, bullying man-child, a jumped-up kissogram-turned-light-entertainment colossus’.
But, if there was a presiding spirit of this democratic age, it was made flesh in David Beckham. The son of an East End kitchen-fitter, he underwent a remarkable transformation, becoming a figure who commanded innumerable front pages and was capable of displacing world events from the news bulletins. Initially, of course, his success was based on his sporting prowess, but that in itself was never going to propel him to stardom, for he was a talented, rather than a supremely gifted, footballer. Asked once whether Ringo Starr was the best drummer in the world, John Lennon had joked that he wasn’t even the best drummer in the Beatles. By the same token, it could be said that Beckham wasn’t in the top three of Manchester United’s midfield four in the late 1990s; Roy Keane, Ryan Giggs and Paul Scholes were all more highly rated players.
But Beckham was by far the most media-friendly, both on the pitch – where his contributions featured spectacular free kicks that were football’s equivalent of the soundbite, tailor-made for news clips – and in interviews, where he came across as an agreeable, if not overly intelligent, young man. It didn’t hurt that he was also as photogenic as a pop star and paid sufficient attention to his image that he could make the tabloid front pages by changing his hairstyle or being photographed in a sarong. There was a brief stutter when he became a national hate figure after being sent off in a 1998 World Cup match against Argentina (hence that News at Ten headline), a fall from grace which reached its nadir when the Daily Mirror unpleasantly printed an image of his face on a dartboard, but the following year he married Victoria Adams and his elevation to superstardom was complete.
Despite his beauty, his athleticism and his wealth, Beckham’s appeal lay in the glamour of normality. In keeping with other 1990s British sports stars, he was no kind of a rebel, and seldom caused trouble for the football authorities. He didn’t even get into trouble for drinking. Rather he was a dedicated professional, was happily married and proved to be an exemplar as a father. His genius lay in transforming that slightly dull, boy-next-door image into an internationally marketable brand, imbuing it with the trappings of charisma and glamour. It was no coincidence that he became the most famous person in the country soon after the death of Princess Diana, who had achieved much the same trick of combining a common touch with the exoticism of fame.
It was faintly incongruous that this democratisation of popular taste was taking place at a time when social mobility was in decline, a development symbolised by the fact that the country was now being governed by the first prime minister for thirty-three years to have been to a public school. ‘The class war is over,’ Tony Blair told the Labour conference in 1999, echoing the thoughts of his Old Etonian predecessor, Harold Macmillan, forty years earlier in the wake of the 1959 general election: ‘This election has shown that the class war is obsolete.’ But despite Blair’s best efforts, the gulf between a powerful elite and an increasingly proletarian mass culture suggested that this was not the case.
Even in Westminster politics, the issue of class could never quite be driven away. When, in 2000, Betty Boothroyd stepped down as Speaker of the House of Commons, convention suggested that her replacement should be a Conservative MP, since the two main parties traditionally took turns. But Labour MPs had other ideas and Peter Snape, in proposing his colleague, Michael Martin, broke further with tradition, attacking the Tories by making a great deal of Martin’s Glasgow background as a sheet-metal worker who had, explained Snape, gone to his first job ‘in an old boiler suit and a pair of boots’. The intention was to draw a class distinction between Martin and his chief rival, the Conservative Sir George Young, an Old Etonian sixth baronet. Martin also had a more direct appeal to backbench MPs, pointing out that his career had been as undistinguished as most of theirs: ‘I’ve never stood to be a whip, a frontbench spokesman or a minister. But come to think of it, nobody ever invited me.’
The continuing awareness of class differences was not restricted to the back benches. In May 2000 Gordon Brown was speaking at a trade union conference when he raised the case of Laura Spence, an A-level student from a state school in Tyne and Wear, whose GCSE results and predicted grades were sufficient to get her into almost any university, but who was turned down by Magdalen College, Oxford and instead was intending to go to Harvard. This was the result, said Brown, of ‘an interview system more reminiscent of an old boy network and the old school tie than genuine justice for society’. He added: ‘It is about time we had an end to the old Britain, where all that matters is the privileges you were born with, rather than the potential you actually have. It is time that these old universities opened their doors to women and people from all backgrounds.’
Unfortunately, Brown had garbled the facts. He referred to Spence’s A-level results, when she had yet to sit the exams, and he failed to notice that of the twenty-seven applicants for five places to study medicine at Magdalen, all had comparable GCSE results, while three of the successful applicants were from ethnic minorities and three were women. It wasn’t quite so clear a case of an ‘old boy network’ as it seemed from the initial newspaper reports, which appeared to be the only information from which Brown was working.
The speech generated a huge amount of press coverage. Some of it was encouraging – ‘The chancellor really is talking our language,’ said the Sun – though the broadsheets were far less favourable, criticising Brown’s ‘harsh and uninformed attack’. There was clearly an issue here. Only 53 per cent of Oxbridge students came from state schools, where 87 per cent of pupils were educated. But that disparity was not necessarily the result of an admissions policy, for applicants to Oxbridge consisted in roughly equal proportions of private and state pupils; in terms of applications, Oxbridge could legitimately claim to give slightly more favourable treatment to those applying from the state sector. It was also the case that the proportion of privately educated students at Oxford had fallen in the post-war years, so that by 1969 the independent schools accounted for just 38 per cent of Oxford students; the numbers only started rising again with the widespread closure of grammar schools, particularly during the period when Margaret Thatcher was education secretary.
Little of that detail was allowed to cloud the ensuing debate, which lasted for several weeks and split essentially along class lines. The chief accusation on the one side was of ‘elitism’ at the country’s top universities, and on the other of underperforming schools in the state sector. The former charge was curious in this context, as The Economist pointed out: ‘You might as well attack the England football team on the same grounds. Institutions which seek to select and foster the best are inevitably elitist.’ The real question, as Brown had correctly identified, was one of access, though he had offered no answers to it. Nonetheless, it was elitism that came to define the episode, with a belief in some quarters that Oxbridge was largely populated by, in the words of Richard Stott in the News of the World, ‘hordes of upper-class, public school-educated, stinking rich, thick aristocrats with far inferior grades’. The universities, according to Paul Routledge in the Daily Mirror, resented the way in which Brown was exposing ‘the dirty little secret that they prefer to give places to public school pupils’.
Brown himself was bewildered by the storm he had unleashed. He ‘went on and on,’ reported Philip Gould, ‘about how this could have happened, why it was that a simple sentiment should cause such a blast from the press. He simply did not understand it.’ But at a time when the gloss had been rubbed off the government by the passage of time and a series of difficult events, his comments – backed by supportive interventions from Robin Cook and John Prescott – provided the opportunity for a good old-fashioned political row. A leader in the Daily Mail was headlined NEW LABOUR, OLD CLASS ENVY, precisely the kind of coverage that Tony Blair didn’t want to see. And some of those around the prime minister couldn’t help wondering whether it was entirely coincidental that Brown, a graduate of the University of Edinburgh, should be choosing as his target Oxford University, where Blair had studied.
Nowhere, though, was the issue of class felt more strongly than on the issue of fox hunting, which had been a fashionable cause on the left for many years. Ostensibly this was a question of cruelty to animals, though few believed that to be the true motivation. Even in its radical manifesto of 1983, when Labour had promised to make illegal ‘all forms of hunting with dogs’, it had hastily added that shooting and fishing would not be affected, giving a fairly clear indication that there were serious cultural limits to the concern for animals. By the time of the 1997 manifesto, which promised merely a free vote in Parliament on hunting with hounds, this reservation was spelt out even more clearly: ‘Angling is Britain’s most popular sport.’ That was as good a reason as any to leave it untouched.
The reality was that hunting, unlike angling, tended not to be popular in those parts of the country where the Labour Party fared well, and even the surprise victories of the 1997 election extended the party’s reach only to the suburbs, not to the rural areas where the hunt was an intrinsic part of life. It was seen, from the left, as an upper-class pursuit favoured by toffs, and therefore more than ripe for banning. Indeed, the first attempts to do so had been made back in the days of the Attlee government.
The battle lines weren’t quite so fixed as that suggested, however. In the 1940s most of the senior figures in the cabinet – from Aneurin Bevan and Ernest Bevin to Clement Attlee himself – had supported the continuation of the practice. On the Conservative benches of the 1990s, Alan Clark was a passionate anti-hunt campaigner, while Ann Widdecombe lapsed into almost Blairite language to explain her position. ‘The scenes of a hunt are splendid,’ she said, ‘so splendid that they are all over my dining room curtains, but they are colourful scenes of Olde England, and in an Olde England, not in modern Britain, they belong.’ In any event, the nature of the hunt had changed, argued Henry Davenport in Drop the Dead Donkey: ‘Nowadays it’s all bloody awful actresses, scrap metal dealers from Essex and jumped-up media types. The fox doesn’t run from terror any more, but social embarrassment.’
Blair himself spoke in favour of a ban whilst in opposition, and voted in support of a failed private member’s Bill in 1992, but it was not an issue that he cared deeply about, and as the prospect of legislation approached, he started to have his doubts. ‘The more I learned, the more uneasy I became,’ he wrote in his memoirs, revealing that he really began to regret his commitment to a ban when he met the mistress of a hunt while on holiday and was swung by her arguments: ‘From that moment on, I became determined to slip out of this.’ But he was sufficiently sensitive to the mood of his party – and perhaps too to the million-pound donation that Labour had received in 1996 from the International Fund for Animal Welfare – that he did not make his opposition widely known.
Instead the government ensured that several attempts to take a Bill through Parliament were thwarted through lack of support and time. A ban would come, was the repeated promise, but not just yet. In January 2001 Blair chose to go on an inessential trip to Northern Ireland in order that he might miss a vote and thereby send a signal ‘to show that fox-hunting wasn’t high on his list of priorities’. A couple of months later, he was to protest in a television interview, with less than total honesty, that he had voted to ban hunting, but that the measure had been thrown out by the House of Lords.
Meanwhile the threat to hunting had prompted the formation immediately after the 1997 election of the Countryside Alliance, dedicated primarily to resisting any such moves. Its first demonstration in London in July that year drew 100,000 people and was attended by William Hague and Michael Heseltine, though the most powerful speech came from the Labour peer, Ann Mallalieu, a keen huntswoman who argued that the issue was one of freedom and ‘the tolerance of minorities’. In comments that were unlikely to make much impact on abolitionists, she declared: ‘Hunting is our music, it is our poetry, it is our art, it is our pleasure. It is where many of our best friendships are made, it is our community. It is our whole way of life.’
Even bigger demonstrations were to be called, reaching a peak of some half a million people in 2002. These were extraordinary events, their composition quite unlike other protests staged in London and, although the preservation of hunting was the central cause, they quickly became a focal point for a host of other grievances. ‘All country people share in the problems of closures of rural post offices, inadequate public transport, crippling petrol costs, diminishing village stores and high community taxes,’ wrote the former defence secretary, John Nott, of the 2002 march. ‘That is why five hundred thousand people came, and I doubt if there are a thousand rich among them.’
Hunting was the touchstone for the Countryside Alliance, as it was for what remained of the left. Perhaps, reflected Alastair Campbell, the attempt to ban the practice ‘would get blocked in the Lords and we could then put a middle way in the manifesto for the next parliament’. But there was no point of contact between the two sides, no formula that would satisfy opposing groups whose passion on the subject was unequivocal, no Third Way for Blair to latch on to, despite his undoubted wish to seek compromise. The best he could hope for was to delay making a decision, staving off the inevitable day of reckoning.
The parliament ended, as it had begun, with hunting still a perfectly legal pursuit. In March 2001 the Parliamentary Labour Party met to discuss the manifesto for the forthcoming election and, noted Chris Mullin, ‘A ban on hunting with hounds was easily the most popular issue.’ It was hard not to see in this singular enthusiasm for what was essentially a fairly trivial piece of class posturing a symbol of how far Labour had drifted from its radical roots.
The emergence of the Countryside Alliance articulated a dimension of regional as well as class conflict, illustrating the vast, and still widening, gap between the town and the country. Rural pursuits were now removed from the mainstream, a situation depicted in a 1998 episode of the BBC sitcom Game On, when a character whose principal reading matter is Loaded picks up a copy of Country Life and wonders: ‘Is this what posh blokes wank over?’ The incomprehension was perhaps equally marked on the other side. Those attacking the hunt saw themselves as fighting decadent privilege; those defending it saw themselves in opposition to metropolitan liberals. The latter was a theme that William Hague took up with some enthusiasm, believing that it offered a way forward for the Tories, a chance to break out from the party’s continuing slump in the opinion polls.
After the dreadful drubbing of Canada’s federal Conservatives in the 1993 election, their provincial counterparts in Ontario had staged a comeback in the middle of the decade by adopting the slogan ‘the common sense revolution’, repositioning themselves as the party of low taxation and individual responsibility, fighting the incompetent bureaucracy of government. The policies were largely drawn from the examples of Margaret Thatcher and Ronald Reagan, and resulted in a major victory in the Ontario election of 1995.
Now, having tried out various options including ‘kitchen sink Conservatism’ and ‘compassionate Conservatism’, Hague revived the same slogan. By experimenting with the inclusive New Labour model, and accepting the social and cultural changes in the country, he had failed to make any headway, and he felt that something new was needed. The shadow cabinet was still split between the wings represented by Michael Portillo and Ann Widdecombe, and if the former’s modernisation programme wasn’t connecting with the electorate, perhaps it was time to try the latter’s traditionalism. ‘If we could get the common sense revolution to stand up and walk around,’ declared Hague in 2000, ‘it would look like Ann Widdecombe.’ He knew he would be accused of ‘lurching to the right’, but the Countryside Alliance demonstrations made clear that Blair didn’t represent the whole nation; there was still a wellspring of opposition to be tapped. And in this campaign, the battleground was, inevitably, law and order.
Unfortunately for Hague, some of his interventions looked simply inept. The publication in 1999 of the Macpherson Report into the murder of Stephen Lawrence saw him articulate a familiar argument. ‘The liberal elite have seized on the report as a stick with which to beat the police,’ he thundered. ‘We will take on and defeat a liberal elite that has always given more consideration to the rights of criminals than the rights of victims.’ This alleged distortion of the justice system in favour of wrongdoers was a long-running complaint – despite Michael Howard’s promise to deal with it when he was home secretary – though the murder of Stephen Lawrence was hardly the best example. It was the police, rather than liberals, who had so comprehensively failed the victim and his family here.
More convincing, for many, was another case later that year: a 55-year-old Norfolk farmer named Tony Martin, who was charged with murder after he shot dead a sixteen-year-old burglar, Fred Barras, who had broken into his house. Martin lived alone and had been burgled on several occasions, and there was widespread support for his right to defend his property since the police were evidently unable to do so. The case was not quite as clear-cut as it first appeared, however: the shotgun used had been held without a licence, and Barras had been shot in the back whilst trying to flee from the scene. But Martin had publicist Max Clifford working on his behalf, ensuring that much of the press coverage remained favourable. ‘It’s something people are really angry about,’ Clifford argued. ‘William Hague, who desperately needs something to improve his image, should get behind this because the support out there is huge.’
Hague did indeed join in the debate, promising that if he were prime minister, householders would have greater rights to defend their property without fear of such prosecution. His comments rattled Blair’s inner circle. ‘We need to appear more in touch with public opinion than Hague,’ worried Lance Price. ‘Having the right to take on burglars is a good populist issue.’ But even a prime minister keen to win public approval was unable to indulge in such grandstanding when a court case was imminent.
Meanwhile Hague was revelling in the attacks he faced from some quarters for playing to the gallery. ‘We’ve got the whole liberal establishment railing against me,’ he exulted. ‘It’s just what I wanted.’ Common sense said that people had a right to protect themselves, using whatever means they had at their disposal; it was only the liberal elite who got hung up on the belief that, in the words of the Guardian’s David McKie, ‘execution, official or freelance, is not a proper sentence for burglary’. Hague might have paused for further thought when the verdict was returned. Martin was convicted of murder, though this was reduced to manslaughter on appeal, when his defence argued that he suffered from paranoid personality disorder. He didn’t make an obvious hero, and public opinion – expressed through the focus group of a jury – was evidently unimpressed.
During the common sense revolution, as so often in recent times, homosexuality returned to the agenda of the Conservative Party. When he first became leader, Hague had made overtures to Torche, the Tory Campaign for Homosexual Equality, thereby arousing the fury of some traditionalists. ‘Why do I share a party with those who advocate sodomite marriage?’ demanded Norman Tebbit. But Hague had then been still in his inclusive phase. Now he seemed to have decided that gay rights were another concern solely of the liberal establishment, and when, in December 1999, the government proposed the repeal of Section 28 of the Local Government Act 1988, he insisted that the shadow cabinet stand united in opposition.
It was a pointless fight. Repeal was inevitable, despite some disquiet in Labour ranks – the chief whip, Ann Taylor, tried unsuccessfully to get a free vote on the issue – but Hague clearly wanted to send signals to the Tory heartlands. It did him little good, and he found himself obliged to sack Shaun Woodward, his spokesperson on London matters, for refusing to go along with the policy, an event that precipitated Woodward’s defection to Labour.
Nor did Hague’s stand make the slightest difference to the Conservatives’ position in the polls, for the reality was by now that few cared much about the subject. Leaders of various faiths continued to insist that this was a key issue – the Catholic Herald declared that the attempt to equalise ages of consent represented ‘a new low in this country’s slide into moral degeneracy’ – and there remained a handful who displayed a horrified fascination with homosexuality, but most of the population had quietly decided that they weren’t particularly interested. The battle had been fought and won, and Hague only made himself look absurd.
He also ensured that the ‘nasty party’ image lived on. The Conservative MP David Curry felt that his children, now in their twenties, should be natural Tories, but regrettably they weren’t: ‘They think the party is totally out of touch. All the stuff about gays is totally incredible to them. Like the British people, they may not think all that much of Labour, but there’s no way they’ll vote Tory. They think we’re a lot of shits.’ As one of his erstwhile colleagues, Michael Brown, observed in response to Woodward’s sacking: ‘Never mind the common sense revolution – just common sense would have done.’
That raised the more philosophical question of what constituted ‘common sense’. Twenty years on from Margaret Thatcher’s first election victory, there had been a change in the country, and what was now common was an attitude of tolerance. Hague should perhaps have noticed this, for in his own quiet way, he represented some of that change. At his first conference as Conservative leader, he had let it be known – to the horror of Thatcher and others on the traditionalist wing of the party – that he and his fiancée Ffion would be sharing a hotel room. They even lived together without being married. ‘The message is,’ wrote the Catholic journalist William Oddie, ‘that if the Tory party expects William Hague to lead them back to traditional family values, they had better think again.’
Earlier in 1999 the story had emerged of a Tory MEP, Tom Spencer, who was married with two daughters, even though he was gay and his wife was well aware of the fact. ‘We discussed my homosexuality long before we got married,’ he explained. ‘Part of our arrangement was that occasionally it was acknowledged that I would go away for the weekend.’ It all seemed perfectly civilised, and no one would have been much concerned one way or another, had it not been for the fact that on returning from one of those weekends, he was stopped by customs officials at Heathrow Airport and fined for possession of cocaine, cannabis and hardcore pornography. His bag was also reported to have contained ‘a sexual accessory and a large black leather suit, with waistcoat and hood’.
Despite this wealth of what should have been media-friendly material, however, the story made very little impact. Spencer stood down as a candidate in the forthcoming Euro-election, of course – being caught breaking the law for anything other than driving at a reckless speed was still a matter for resignation – but his tale disappeared from the papers and from public consciousness within a couple of days. It did so mainly because neither Spencer nor his wife appeared particularly perturbed. ‘Of all sexual perversions, chastity is the most peculiar. I am not capable of it,’ he shrugged, in a paraphrase of a line from Aldous Huxley. Not even the previously reliable Richard Littlejohn was able to work up much froth of indignation. ‘If his missus can live with it, that’s up to her,’ he wrote, with weary resignation. ‘I’d rather not know.’ The Conservative Party, it appeared, was not exempt from the changing climate, and the days of sex scandals seemed to be receding.
Equally unshocking was Spencer’s other offence of using recreational drugs. In the run-up to the 2000 Conservative conference, Ann Widdecombe wanted to announce that £250 million would be spent on an anti-drugs programme, but Michael Portillo, as shadow chancellor, refused to approve any such commitment. Undeterred, Widdecombe made drugs the centre of her conference speech, calling for a hard-line approach that included £100 on-the-spot fines for smoking cannabis: ‘It means zero tolerance for possession. No more getting away with just a caution.’ She was given a standing ovation in the hall, but few outside were convinced. The police responded by saying that her policies were impractical, Hague began to backtrack almost immediately, and even the Daily Mail was unimpressed: ‘Is she really serious about criminalising every spotty adolescent who tokes up behind the bicycle sheds?’
Worse was to come. The Mail on Sunday rang around members of the shadow cabinet to ask whether they had ever smoked cannabis. Seven said that they had, including some of the brightest hopes of the party – Francis Maude, Oliver Letwin and David Willetts – with Tim Yeo later joining the chorus. The willingness of so many to go on the record as having broken the law was surprising to say the least, and raised the suspicion that it was all a put-up job, that there had been collusion in an attempt to destroy Widdecombe’s career by making her a laughing-stock. It was, reported the Sun, ‘a calculated bid to damage the shadow home secretary’s image as a future Tory leader’.
Had such an operation been staged, the one group that stood to gain was Portillo’s supporters. Certainly that was how Widdecombe herself saw it. ‘It was a spiteful attempt to damage me,’ she asserted. ‘They saw me as a rival and set out to damage me. They had done the same to others like Liam Fox and John Redwood when they looked like potential rivals to Portillo. I have no doubt about it.’ Whether it was with Portillo’s approval or not, the leading Tories did seem to be being picked off one by one, in a way that brought the party no short-term benefit.
Some of the change in attitude was the consequence of a concerted campaign against prejudice. The country’s governing and media classes, it was alleged, were in the grip of the cult of political correctness, though since the charge was made on such a regular basis by commentators in most of the national newspapers, by members of the royal family and by politicians of both major parties, it was hard sometimes to see quite where political correctness was so entrenched, save perhaps at the BBC. Certainly there was a determination on the part of broadcasters, politicians and press – in varying degrees – to strive to avoid causing deliberate offence to members of groups marginalised in the past, but arguably the underlying motive shared more with traditional values than it did with modish ideology: ‘I call it good manners,’ wrote Jon Savage, in the Guardian in 1994; ‘indeed, being socialised: thinking about people other than yourself.’
Nonetheless, tales of ‘political correctness gone mad’ were commonplace right through the decade (the first sightings of the phrase were in 1993), if frequently exaggerated. A characteristic news story in 1994 claimed that a school in Greenwich had changed the opening lines of John and Yoko’s ‘Happy Xmas (War Is Over)’, so that it now started: ‘So this is December, and what have you done?’ The assumption was that this was an act of political correctness, done to avoid offending those of other faiths. The school protested that this could hardly be the case since the rest of the lyrics remained the same (the chorus still ran ‘And so happy Christmas’); rather it was just an attempt to encourage pupils to reflect on the passing year. But the explanation received less coverage than the original story, and instead John Lennon was cited as an integral part of British culture that was being silenced by the loony left. (Since he had been regularly lambasted by the right for his blasphemy, drug-taking and support for the IRA, he would presumably have relished the irony, had he still been alive to enjoy it.)
Most of those commentators who denounced political correctness were also exercised by the slipping of standards when it came to bad language, though this concern was not consistent across the country. When, for example, the Broadcasting Standards Council conducted a survey in 1994 on people’s attitudes to swearing on television, it found marked regional differences: ‘bastard’ and ‘twat’ were regarded as strong swearwords by 59 and 44 per cent respectively of people in the North, but by only 35 and 13 per cent in the South. Meanwhile the ITC, which regulated the commercial channels, noted that scheduling changes attracted far more complaints than questions of taste and decency; the biggest source of grievance in 1993 was the cancellation of the Scottish soap opera Take the High Road.
In any event, protests about the coarsening of culture, and about the imposition of new taboos, were, for the moment at least, fruitless. The drive for change appeared unstoppable. It was not simply the work of a demonised elite, but a consequence too of the democratisation of the media, as the everyday became visible. People became more tolerant as they saw individuals just like themselves depicted, with all their failings, foibles and follies, on a daily basis, whether on reality television or the internet.
If there was an elite to be fought, it was rather to be found in the new class of politicians emerging on both sides of the party divide, whose only experience of life after university was in the bubble of Westminster think tanks and on the periphery of the media, before becoming political advisers and then MPs. The gap between this remote, self-contained, self-replicating group and a rapidly democratising culture did not augur well for the future. Nor did the fact that the political elite appeared enthralled by the spectacle of riches, at a time when wealth disparity was continuing to increase. For the moment, with a growing economy, that was not yet critical, but it did not look like a sustainable model for society in the longer term.
When comparisons were made between voting numbers on reality television and those in elections, the lesson generally drawn was of the declining state of democracy in the country. Sometimes an accusation was levelled that there was a causal connection between the two phenomena, that a combination of cheap, imported electronic goods and the material consumed via those goods constituted a latter-day bread and circuses. It was equally possible, however, to see the growing interactivity of the media as the first, inchoate stirrings of a new model of politics.
‘It may be that the era of pure representative democracy is coming slowly to an end,’ reflected Peter Mandelson in a 1998 speech, observing that the political elites that had dominated democracy for so long – including those in the intellectual, trade union and local council spheres – came from ‘an age that has passed away’. In the future, he argued, representative institutions would be ‘complemented by more direct forms of popular involvement, from the internet to referenda’.
It wasn’t an entirely new thought. Earlier in the decade a character in The Politician’s Wife had made the same point: ‘They’ll hold referendums the same way. From Strasbourg to the Senate, virtual reality giving virtual power to the people.’ What remained to be seen was whether – if this truly was the dawning of an era and if, as the education minister Kim Howells put it, ‘the days of the old political class are over’ – the new breed of professional politicians would have a role to play in it. For the present, there was merely an uneasy recognition in some quarters that things were changing and that the manifestations of that change were unpredictable.
In particular, it was unclear whether the new democratising spirit of the age was to be matched by a new level of educated opinion. Recognising the rapid growth in the use of computers, the British government had declared 1982 to be the year of ‘information technology’, and that phrase had become the default description of the emerging new world, gaining even greater prominence with the spread of the internet. The relationship between information and education, however, remained uncertain, and the celebration of ‘emotional literacy’ that some detected in the response to Diana’s death only added to the confusion. There was a suspicion that the elevation of opinion over fact might yet prove damaging. So although an anti-elitism was undoubtedly evident in the country, there was also a danger that the tendency would shade into a distrust of experts of all kinds. A survey of 8,000 adults by the University of Leeds in 1997 found that 83 per cent believed science created more problems than it solved.
The most acute example of the attitude came in 1998, with the publication in The Lancet of research suggesting a link between the MMR vaccination (given to small children to immunise them against measles, mumps and rubella) and the incidence of autism and inflammatory bowel disease. ‘If I am wrong,’ commented the paper’s author, Dr Andrew Wakefield, ‘I will be a bad person because I will have raised this spectre. But I have to address the questions my patients put to me.’ More than a decade later, The Lancet retracted the piece, and Wakefield was struck off by the General Medical Council on grounds of serious professional misconduct in his research.
By then, it was too late for some. Reporting of the controversy had adversely affected vaccination rates for several years after Wakefield’s piece, despite overwhelming medical opinion and scientific evidence that the MMR triple jab was not only safe but hugely important for public health; it had played a critical role in slashing the number of cases of measles by 95 per cent in the preceding decade. The inevitable consequence was that the incidence of the diseases had risen. The anti-MMR scare was fanned by a feeling that the medical establishment tended to ignore the concerns of ordinary parents, and by reporting in some parts of the media that was ill-informed and often ignorant of basic science.
Much of the coverage centred on whether Tony Blair had allowed his youngest child, Leo, to have the vaccination, a question which he steadfastly refused to answer, insisting that he wished to protect the privacy of his children. (The story of Leo’s conception at Balmoral had yet to be released into the public domain.) Since the real issue was one of social responsibility – by not vaccinating one’s own children, one helped to spread the risk of dangerous diseases – it might have been more helpful had he made his position a little clearer.
The same reticence was not evident in Blair’s other pronouncements. At times he seemed all too eager to comment on passing news stories, particularly if doing so enabled him to appear in touch with popular culture. In April 1998 he added his voice to the tabloid campaigns calling for the release from prison of Deirdre Rachid, who had been jailed after being unwittingly implicated in a fraud case. The fact that she was a fictional character, played by Anne Kirkbride in Coronation Street, didn’t seem to trouble him in the slightest.