IN THE BROAD cultural sense, democracy flowered in the West in the decade following Hitler’s defeat. All parts of society had endured the catastrophes of war. The idea that some classes had a greater claim than others to the fruits of peace was untenable.
In Britain, a great levelling began. A Labour government, elected by a landslide, laid out the new order. A welfare state was established, including a national health service. An expanded Education Act, along with many more of those one-shilling Penguin books, gave ordinary people a theoretical chance to better themselves; new national parks guaranteed their access to nature. Slums were levelled and (eventually) replaced. Meritocracy – notionally, at least – replaced hierarchy.
The pace of change was uneven. Hundreds of thousands of soldiers returned from war to find nowhere to live – my father was homeless for two years. But you didn’t have to share his plight to share his conviction that change was unstoppable. Even The Times (in a wartime editorial) had joined the consensus: ‘A new order cannot be based on the preservation of privilege.’ Those ‘Christian’ values for which the Allies had fought demanded a redistribution of prosperity and power.
This levelling was poignantly embodied by the demolition of hundreds of stately homes. Many had already been requisitioned to help the war effort. (‘The builders did not know the uses to which their work would descend,’ mused Evelyn Waugh’s fictional alter ego, wistfully contemplating such a scene in Brideshead Revisited.) Now, in the shared austerity of peace, grand country houses seemed embarrassing as well as unfeasibly expensive. Their owners recognised that they were on the wrong side of history.
Down came the crumbling piles, year after year. Yet one remained untouched. Somehow, the rising tide of expectations and opportunities stopped short of Westminster. A splash of new working-class MPs diluted the privilege a little in the Commons, but you could barely taste the difference. It was still a chamber of ‘masters’ (as Labour’s Hartley Shawcross allegedly described his party colleagues). Elected representatives – most, like Shawcross, well-educated and well-off – decided what needed to be done; the ‘gentleman in Whitehall’ – who, Labour’s Douglas Jay insisted, ‘really does know best’ – helped them to do it. The electorate remained outside, passively having it done to them.
In the House of Lords, there wasn’t even a veneer of democracy to justify the set-up. It would have been hard to find a better embodiment of ‘the preservation of privilege’ than the mostly hereditary chamber – yet, beyond some ill-tempered tinkering from 1947 to 1949, to reduce peers’ powers of delay, nothing was done in that first crucial post-war decade to correct it.
At the time, it didn’t seem a big deal. The big battle for democracy had been won. That, at least, was secure, and one day the benefits must trickle down through Parliament to those its members represented. That was the faith to which demobbed soldiers clung: things could only get fairer.
Time proved them wrong. Years turned into decades. By the end of the century, the trickle-down of power had slowed to a drip. New devolved assemblies offered opportunities for an expanding political class. The electorate seemed more alienated than ever. The rise in membership of single-issue campaigning groups and the collapse in membership of big political parties were contrasting symptoms of the same imbalance. Thanks to new communications technologies, voters were better informed about current affairs than ever before. Their say in significant decisions about government policy remained as minimal as it had been in the age of the horse and cart.
A new generation of politicians – born, like me, in the post-war baby boom – had achieved maturity and power by then. Getting filthy rich was back in vogue. Inside Westminster, people seemed more concerned with the rewards of power than with its duties. As for distributing power more fairly, attempts at Lords reform from the 1990s onwards reduced one fault in the upper chamber – the preponderance of hereditary power – but exacerbated another: its vulnerability to prime ministerial patronage. The Lords became more competent, but only marginally more representative.
Since then, even the drip has stopped. Yet beyond Parliament the world has remade itself, and the imbalance has begun to feel intolerable – and impossible to ignore. Empowered by social media, the masses clamour with growing impatience for a greater say in the day-to-day decisions of government, while a discredited political caste clings with diminishing confidence to its privileged role as the people’s exclusive proxy. Political discourse has become more inclusive. Government has not.
Which brings us back to democracy.
English-speaking members of the Allied forces who didn’t have recourse to Churchill’s oratory to express their war aims could as easily have used Abraham Lincoln’s: they were fighting Hitler to ensure ‘that government of the people, by the people, for the people, shall not perish from the earth’. In the strictest sense, however, they were fighting not for democracy but for its representative variant: republicanism.
The American Founding Fathers were clear about the difference; so, from 1828, was Webster’s Dictionary. In a democracy, according to Webster, ‘the people exercise the powers of sovereignty in person’; in a republic, ‘the exercise of the sovereign power is lodged in representatives elected by the people’. The Fathers were unanimous in preferring the latter. Benjamin Franklin may never actually have characterised democracy as ‘two wolves and a lamb voting on what to have for lunch’, but that was the gist of the consensus. ‘Pure democracies have ever been found incompatible with personal security,’ said James Madison, who shared John Adams’s fear of ‘the tyranny of the majority’. Thomas Jefferson warned against ‘a government of wolves over sheep’. They chose a representative republic instead, and built checks and balances into their constitution to hold raw democracy at bay.
Britain’s adoption of the representative model was more drift than choice. We had a Parliament of representatives long before we had universal suffrage. By the time the entire adult population had been enfranchised (in 1928), this model was synonymous with ‘democracy’. Edmund Burke’s 1774 definition of the representative’s duties (‘Your representative owes you, not his industry only, but his judgement; and he betrays, instead of serving you, if he sacrifices it to your opinion’) was retrospectively embraced as a jewel of Britain’s unwritten constitution. The supremacy of Parliament’s honourable members was unchallenged.
But popular democracy never stopped playing catch-up. Once the long battle for universal adult suffrage had been won (via grudgingly conceded landmarks in 1832, 1867, 1872, 1884 and 1918), the British people grew accustomed to their right to vote. Many then began to reflect on how else they might contribute to their nation’s governance; a few of these reflections later acquired a degree of reality. The UK lagged behind the US in this respect: the Americans didn’t have the same baggage of feudalism and absolutism to work through, and their systems of open primaries and town hall meetings have been allowing citizens to contribute meaningfully to politics at a local level for much of their history. The democratic opportunities eventually offered to British voters came later (towards the end of the twentieth century) and were narrower.
The establishment of devolved legislatures for Scotland, Wales and Northern Ireland was more a widening of representative democracy than a direct empowerment of citizens; those who gained most influence were themselves career politicians. Citizens’ juries, pioneered in the US and Germany in the 1970s and 1980s and popularised in the UK by the Institute for Public Policy Research in the 1990s, were more obviously designed to bring ordinary people into the political process. Typically, such juries comprise between twelve and sixteen people, invited at random from the electoral register and convened for two to five days to deliberate on a particular issue, as an advisory aid to policy-makers. Well over 100 of these juries – depending on how loosely you define them – have been convened in the UK since 1997, dealing with such issues as health care rationing, educational reform, waste disposal, pensions, indecency on television, genetic testing and climate change.
Tony Blair and Gordon Brown both showed bursts of enthusiasm for this kind of public consultation, and from 1998 to 2002 a 5,000-member People’s Panel provided regular feedback to Blair’s Labour government. But such juries have fallen out of fashion in the past decade (notwithstanding some well-received pilots organised, since 2015, by the UCL Constitution Unit’s Citizens’ Assembly project). Their decline may reflect a perception that, in reality, these are little more than glorified focus groups: ‘consultative bodies, not decision-making ones’, as the IPPR’s Guy Lodge and Rick Muir put it in 2007 when urging the Brown government ‘to avoid giving the impression that citizens’ juries are an example of “giving away power”.’ The decisions that matter – not least about which issues to put to juries and what to do about their verdicts – have continued to be made in the usual corridors of power.
That may be one reason why, during the past twenty-five years or so, there has been a precipitous decline in public engagement with the electoral process. Voter turnout in general elections fell by nearly a quarter between 1992 and 2001, from 77.7 per cent to 59.4 per cent. It has since recovered slightly (to 66.1 per cent in 2015 and 68.7 per cent in 2017), but the message remains clear enough: for many people, being allowed to vote in a first-past-the-post system once every five years doesn’t feel as empowering as it once did. (And if you’re anything like me – nearly forty years of voting without once having seen a candidate I voted for elected – you may not have found it all that empowering in the first place.) Hence the hunger with which the public has grabbed at each small concession that has been offered since – from the Conservatives’ repeated experiments with open primaries to Labour’s adoption of ‘one member, one vote’ for its leadership elections. Meanwhile, the Recall of MPs Act (2015) has given voters a way of ejecting delinquent representatives between elections, and the creation of a Commons Petitions Committee and e-petitions website – also in 2015 – has empowered the general public to draw issues to the attention of Parliament and, with sufficient support, to prompt Commons debates. More than 10 million unique email addresses were used on the site in its first year.
The trouble with all these power-sharing processes is that, with the exception of Labour’s rule change (widely deplored in Westminster as a disastrous error), they have failed to deliver real power to the mass of ordinary citizens on the issues that matter to us most. That’s not to deny the benefits of initiatives that increase public participation in politics; but the big picture has barely changed. Elected representatives, and the party machines to which they are answerable, remain in day-to-day control. The representatives and parties would argue that this has been just as well: the problems of government are intractable enough as it is, without the added complication of the public’s mood swings. Perhaps they are right – but the public see it differently.
When the social media revolution came, a new kind of politics was unleashed. No voice, no matter how unruly or ignorant, is now excluded from the nation’s political discourse. The formal mechanisms of power remain as they were, but the reality is new: if the popular consensus is big enough, and the voices are loud and angry enough, elected representatives can be influenced from below on a decision-by-decision basis. (Just look at David Cameron’s contortions when the public suddenly started taking an interest in the refugee crisis in September 2015; or consider the effect of social media outrage on the unaccustomed seriousness with which political parties have suddenly begun to take sexual harassment.) For the first time in centuries, democracy in Webster’s sense has become a significant force in the UK.
We are still digesting the implications. Should elected representatives be paying less heed to collective outpourings of conviction, or more? The answer isn’t obvious. We excoriate politicians both for ignoring public opinion and for being too much in thrall to it. The inconsistency may be partly explained by a paradox of modern politics: the fact that, in addition to being more informed than ever before, the average voter is also more misinformed. The new digital communications technologies have enabled a secondary revolution in which lies, half-truths and semi-digested facts pollute our collective sense of what is going on in the world, while online confirmation bias protects us from the cleansing effects of alternative points of view. All but the most careful consumers of round-the-clock news are left vulnerable to manipulation, misinformation and misplaced certainty, and the wild excesses of popular political discourse embody this vulnerability in occasionally alarming ways. Sometimes we get it right; sometimes we get it badly wrong. For all the imperatives of democracy, simply saying that the people must be heard and obeyed feels dangerously simplistic.
Yet the thing itself is surely beyond dispute. There’s a new force in politics – let’s call it people power. The people, wise or foolish, right or wrong, demand to be heard; and public opinion has become what the Liberal statesman Lord Bryce suggested more than a century ago that it might become: ‘the giant before which all tremble’. Like it or not, that giant is changing Britain.