The generation that inherited this absurd cultural legacy would scarcely be able to find affordable housing, let alone customise it in accordance with home makeover shows. While the income of young adults had increased by only 19% between 1997 and 2018, house prices had risen by an eye-watering 173%,30 creating a generation perpetually locked out of home-ownership. But this generation would also comprise an unprecedented number of university graduates, many of whom had been exposed to economic and political ideas alternative to capitalism. In a beautiful stroke of irony, it would be graduates of Tony Blair’s “education, education, education” who would help to bring Marxism back into mainstream political discussion. Additionally, cultural studies would prove to be one of the most fertile and essential academic disciplines of the late twentieth century, with founding father Stuart Hall having already begun the important work of documenting the slow and steady transformation of the Labour Party to align with Thatcher’s vision of Britain by the time Blair was elected prime minister in 1997.31
Hall, and those working in a similar tradition, would be responsible for creating a new strain of intellectualism in Britain, one far more rooted in the everyday than the approaches peddled by historic seats of learning like Oxbridge. Cultural studies in and of itself represented a more egalitarian mode of academia, insisting on the academic significance of everyday phenomena, including aspects of the working-class experience and popular culture. These, it argued, could be used as valid points of entry for understanding the ways in which society was ordered, and the systems of power that governed its fate.
As such, cultural studies as a discipline was antithetical to the abstract, market thinking of neoliberalism, and questioned the assumptions that had led to the dismantling of social support and community in favour of rampant individualism and capital. Hall himself credits the origins of cultural studies to the widespread, grassroots campaigning of two groups he was renowned for championing — the Campaign for Nuclear Disarmament (CND) and the New Left.32 Its focal point would often be the experience of working-class immigrant communities, whose cultures had been excavated and exploited for decades with little to no remuneration, credit or concession from the mainstream avenues of politics, media and culture. As Hall writes:
vernacular lived cultures spoke more powerfully than — and as a substitute for — formal politics. Popular religion and urban culture became proxy symbolic resources in which poverty, social discontent, people’s disaffiliation from the system, class interests, racialized divisions and political differences found expression.33
Though he was speaking about the emergence of a black Afro-centric consciousness, this would prove to be the foundational premise of cultural studies more broadly and was equally true of other identities, whether along lines of race or class (or, of course, both). In this sense, the creation of cultural studies wasn’t just an effort to give these issues the academic attention they deserved, but to excavate them from the world of anthropology and sociology, where they had been reduced to artefacts through colonial oppression, and to instate their authors and creators as valid narrators of history.
On account of New Labour’s education reforms, interest in the area of cultural studies began to reach critical mass during the 2000s, lending a fresh perspective to some of the political crises of the time and injecting public discourse surrounding the Iraq War with academic rigour, fomenting a more thorough and widespread examination of the colonial discourse that had been used to justify it. But the appetite to air the ugly origins of British imperialism bucked a larger trend in the cultural output of the 2000s, which was otherwise defined by a sense of nihilism and irreverence.
This has been somewhat lazily attributed to the rise of the internet, as if the medium itself necessitated irony. But while the distance and anonymity afforded by the internet arguably encouraged more combative behaviour, its ability to connect people was equally enabling of bonds and allegiances. Far more likely, then, is the idea that with the help of the internet, a generation was able to more easily and collectively identify and process the harmful permutations of late capitalism. Their cynicism wouldn’t be created by the internet, but it would be shared far more quickly and become much more entrenched as a result of it. What was driving it was a legitimate observation of the growing number of false pretences under which society was governed: the false promise of a political class who forfeited integrity for PR nous; the false promise of an advertising industry that eroded the cultures it sold back to us; and the false idea of a more democratic society that really only served to embolden those in power.
The effect that all of this had on the morale of a generation cannot be overstated, and is fairly well summed up by David Foster Wallace in his famous essay “A Supposedly Fun Thing I’ll Never Do Again”, based on his experiences travelling on a luxury cruise liner. Wallace dissects the extreme artifice of that experience with the same bemused outsider perspective that made him famous, using it as a vehicle to explore the methods of deception and delusion underpinning our concept of luxury — how we can transform the fairly sexless and laborious pursuit of seafaring into something rare and decadent. In relation to the advert, or advertorial, that had been written by author Frank Conroy to promote the cruise, Wallace writes:
An ad that pretends to be art is — at absolute best — like somebody who smiles at you only because he wants something from you. This is dishonest, but what’s insidious is the cumulative effect that such dishonesty has on us: a perfect facsimile or simulacrum of goodwill without goodwill’s real spirit… it messes with our heads and eventually starts upping our defences even in cases of genuine smiles and real art and true goodwill.
The effect of this, he argues, is to leave us feeling “confused and lonely and impotent and angry and scared.”34
For those of us coming of age at a time when the Blair years were being dramatically blown apart, the Bill Clinton legacy torn to shreds and the cultural output of an entire decade cast into doubt, this inventory of emotion resonates with what I and several of my contemporaries remember feeling at the time: after the exuberance of the late Nineties, there was a malaise in pop culture that concealed a deep sense of embarrassment and shame. The optimism of the previous decade would come to seem naive, its garish spectacle casting a long shadow over a subsequent decade characterised by warfare and, later, extreme economic precarity.
The YBAs had transformed art into grotesque caricature, embracing an absurd level of greed and corruption. If Damien Hirst, the Chapman Brothers, Tracey Emin and their peers had initially succeeded in passing off their connections with the likes of Charles Saatchi as an exploration of the corrosive potential of capital, then by the mid-2000s they had become some of its most passionate advocates. It became increasingly apparent that many of these artists had enjoyed the joint privilege of presenting themselves as a challenge to the establishment whilst also enriching themselves via a market that has long stood accused of facilitating corruption and money laundering.35 In 2011 Emin, who had staked her career on being the flagrant voice of working-class Margate, declared her support for the Conservative Party, claiming that a vote for the Tories would be the only valid vote for the country’s creative industries.36
Meanwhile, Blur bassist Alex James, whose band had never been too far away from accusations of Torydom, became a card-carrying member of the so-called Chipping Norton Set — an outpost of the British media consisting of some of its most famous and vocal Tory supporters, made notorious by its involvement in the phone hacking scandal. That scandal, in addition to disrupting the lives of countless celebrities and the families of those affected by the 7/7 bombings, was most infamous for the interference it caused to the investigation into the murder of schoolgirl Milly Dowler in 2002. The Chipping Norton Set included the likes of News UK CEO Rebekah Brooks, who was eventually cleared of all charges connected to the scandal, as well as David Cameron and his wife Samantha. It also included former Top Gear host Jeremy Clarkson, who in 2015 was dismissed from the BBC on account of having punched a producer for offering him a cold meat platter in place of the steak that he so violently desired. James would prove to be their cheese-touting hipster friend, putting paid to any previous efforts to conceal his — and by extension, Blur’s — status as exports of the bohemian, consumer-shunning middle classes. James would go on to create several food festivals which have since gone bankrupt, including the cleverly titled Big Feastival, co-hosted with “national treasure” Jamie Oliver. Damon Albarn, on the other hand, was following in the footsteps of White Saviour musicians Bono and Bob Geldof, making the musical equivalent of the aid charity Oxfam’s “Give a Man a Fish” TV ad with his album Mali Music, inspired by a trip he made to the country in support of the charity in 2002. This would be followed by a brief foray into international politics with the album Democrazy, recorded in hotel rooms during the US leg of the tour for Think Tank, Blur’s seventh studio album, released in 2003.
The cultural climate of the previous decade had started to look tired and past its best. Publicly, an apathy had started to set in as Blair’s tenure slowly and inevitably rolled round into Brown’s, and a Labour Party that truly represented the working-class vote in Britain faded to little more than a distant memory.
To fill the cultural vacuum that had been created, the younger generation would turn its attention increasingly towards the past — a past beyond the Nineties, heralding a resurgence in the popularity of guitar bands closer in appearance and sound to the musicians of the 1970s than to those of the previous decade, in a trend that music writer Simon Reynolds terms “dyschronia” — a very specific strain of nostalgia for a future that was never fully realised. The rare few artists who emerged triumphant from these lost years, for example Amy Winehouse and Adele, would bank on this sense of chronological disjuncture. Emerging from the pubs and bars of Camden and east London and advertising their shows via digital flyers shared on Myspace, their music riffed on the sounds of far older musicians such as the Ronettes and Sandie Shaw.
Grime presented the only real exception to this, but its ascent was far from straightforward and, to the extent that it was still largely maligned by the establishment, its wins were often outweighed by its setbacks. Dizzee Rascal’s 2003 Mercury Music Prize win was followed by wins for a number of bland, middle-of-the-road indie acts and Radio 2 fodder. While there was jubilation at Skepta’s win a whole thirteen years later, there was also an uncomfortable wince at the way the industry was still touting the genre as “emerging”, fist-pumping the air with self-congratulation despite having failed to heed the call of its enormous fan base for well over a decade. Journalist Chantelle Fiddy recounts a similar experience in an interview for Inner City Pressure. Talking about her experiences as a music journalist during the early 2000s, she claims:
There’s a core issue with many editors… They simply can’t see past their own socio-economic background and class reference points. Pitching Wiley features to Mixmag in 2003, they’d say “no one has heard of him”. Which was true, if you asked the attendees at Cream and Ministry of Sound, but if you walked through Mile End with him, he was a street demigod. It’s narrow mindedness, and it perpetuates social division and the underachievement of any act not appealing to middle-class journalists.37
It’s fair to say that grime now constitutes one of the main pop cultural strains in Britain, but not only has its trajectory to the top taken far longer and been far more fraught than almost all others, its eventual and hard-won success is now also being touted by the same industries that discriminated against it as a symbol of greater inclusion, distracting from the many ways in which these industries have otherwise become far more elitist. In what now looks like a stark case of racial bias, and after the mainstream media’s fairly unequivocal abandonment of garage by the mid-2000s, Mike Skinner would be the main beneficiary of a genre built by black working-class voices. This isn’t to render his contribution invalid, but to highlight the very limited and tokenistic concessions that were being made to the working-class voices of Britain during this time. Though the cultural output of the 1990s had undoubtedly favoured the white working class, garage and the slow emergence of UK rap had carried the promise of a more representative mainstream. That trend was reversed during the 2000s, characterised by a mawkish doubling down on the nationalist sentiment that had first been legitimised by Cool Britannia and which underscored the musical output of a generation of indie acts epitomised by the Libertines.
Nevertheless, Skinner did emerge as a valid voice of the working classes during this time, and I was grateful to him for lending something of a mythology to Birmingham and the people it contained, the piano sequence to “Has it Come to This?” becoming the theme to a million hours spent lounging in the park or in our friends’ back gardens. What marked him out from the generation that came before him was a sense of melancholy for the marginal existence into which he’d been plunged. This wasn’t the proud cry of a dole claimant who rejected the bigotry of Thatcherism and wanted to proudly elevate working-class culture in the way Oasis had, but a young man’s lament for having been stripped of all dignity, promise and respect. In this sense, Skinner was one of the first voices in pop culture to stand up and ask what was left for those who couldn’t seize on the limited opportunities afforded by Blair’s education reforms and programmes of social mobility.
Elsewhere, other emerging trends in pop culture included an irreverent, sharp strain of American writing. In the early 2000s, outlets like VICE, Gawker and Nylon created the wry voice that would come to define the language of the early internet, building on a tradition first established in the 1980s and 1990s by Brat Pack writers like Jay McInerney, whose novels found a new audience in this disillusioned and nihilistic generation. There was a groundswell of people who were sharp and had ideas, and who through the internet were building a new style of communication that married their critical sensibilities with a love and respect for the pop culture on which they’d been raised. This generation would inherit a climate of widespread scepticism towards the political class and the media and, with greater opportunity at its fingertips, would begin experimenting with self-publishing and broadcasting. The internet gave people from disparate places and communities the opportunity to converge and share ideas, as well as the chance to publish work without the support of traditional middlemen or cultural gatekeepers. Though it was a promise often built on illusion, and the narratives of popstars being discovered on Myspace were often little more than marketing myths dreamed up by record label bosses, the ability to publish online nevertheless gave young people the hope of being able to take back the reins. This, coupled with unprecedented levels of education, seemed to spell a new era of opportunity for people from low-income backgrounds living outside the well-trodden routes to success.
But the internet’s promise as a route to self-determination had only begun to materialise when the world stood on the brink of immense economic upheaval, an upheaval that would leave millions of people rootless and without hope, including a generation of graduates who lacked the basic means to meaningfully seize on these new avenues because survival had to come first. The “Class of 2009” faced a seventeen-year high in graduate unemployment,38 with the Higher Education Careers Service Unit reporting at the time that one in ten graduates that year would be out of work, with many more in stop-gap work such as bar work or waiting tables.39 In the years that preceded the crash, many experts and spokespeople of a new cyberutopia had predicted that the internet would herald an explosion of transnational communication and collectivism of the free-market and dot-com variety, creating new avenues for enterprise and revenue. Few of them, however, had predicted the internet’s more common use as a platform for the millennial lament. While some affluent young people, mainly concentrated in the Bay Area, might have used their private wealth to create tech start-ups and e-commerce brands, a far greater number would use it for the more nihilistic purpose of broadcasting thoughts about the realities of life under late capitalism.
The commentariat would often characterise this generation as lazy and apathetic, without realising that in reality it felt no affiliation to the political climate it had inherited, rather than a disillusionment with politics per se. With the Labour Party transformed into a Frankenstein’s monster of a supposed opposition, the political class exposed as little more than shrewd marketeers, and the apparent fact that a war could be waged even in spite of mass demonstrations, who could really blame these young people for feeling an immense sense of helplessness and despair?
The first glimmer of hope came with the student protests of November and December 2010. Held in response to proposals by the UK government to cut higher education spending and to raise the cap on student fees (specifically, plans to triple university fees and to scrap the Educational Maintenance Allowance (EMA), which had been providing students from the poorest backgrounds with £30 a week to offset the financial strain of not being in full-time employment), the protests were also the first major articulation of resistance to the austerity programme that had been declared by then-Chancellor George Osborne in his June budget of that year. Those gathered were angry with the newly formed coalition government, whose co-signatory, the Deputy Prime Minister and leader of the Liberal Democrats Nick Clegg, had pledged only months earlier, as part of his general election campaign, to vote against any proposed increase in tuition fees. The betrayal was stark, and symptomatic of a political class that lacked integrity and principle. Coming in the wake of the cataclysmic financial crash, the protests channelled the rising resentment towards the recklessness of the banking sector and the rampant consumerism and debt culture of older generations, but also towards the aggressive ideology that the Conservative-led coalition had employed in response. That these young people would not only be held responsible for redressing the economic uncertainty they’d inherited, but would also be saddled with an unprecedented level of debt in order to attain the qualifications now presented as the customary means of entering the modern workplace — a workplace that was failing to provide an adequate number of real jobs — was a cacophony of injustices capable of finally obliterating their longstanding malaise.
In the short term, the efforts of the student protesters would prove in vain: in December that year, parliament voted to raise the ceiling on annual tuition fees in England to £9,000, a maximum that most universities swiftly adopted, making higher education prohibitively expensive for many working-class applicants. This would mark the first of many policy decisions by the newly installed coalition government that showed an arrogant disregard for the vast majority of people living and working in Britain, including the now infamous bedroom tax, the denial of benefits to 165,000 disabled people and school cuts. With Labour dead in the water and its traditional voter base lacking any kind of real representation in parliament, the new coalition ran riot, feeling no pressure to appeal to the working class in the way that even Thatcher or John Major had.
Over the coming years, however, the movement that had started to form in the wake of the student protests would begin to snowball. In many ways it would set off a chain of events capable of eroding some, if not all, of the irony poisoning that had previously defined a generation, and it would be propelled by the teachings of an academic tradition heralded by the likes of Mark Fisher and Stuart Hall. At first it would do so largely unnoticed by the mainstream media, whose failure to notice would prove damaging to the movement’s lasting credibility and only exacerbate the sense of disconnect between this newly energised youth and those in charge. This would pave the way for a grassroots media better equipped to report on the phenomena taking place under the radar of the metropolitan elites. Part of the problem was that in the wake of the financial crash, industries across the board — but specifically the media, which had always operated with a level of unchecked nepotism — neglected to hire many young people from low-income backgrounds. And what many of us working at the edges of the industry knew and felt anecdotally would be vindicated by the number of false predictions made by the incumbent media in the years that followed.
For a few shrewd people present at the student protests, the events would not just prove symptomatic of a political order that was failing many people — not least young people — but would also demonstrate the appetite that was emerging for a new era in politics, and a new era of information and culture. This is a story told with astute observation by Matt Myers, whose book Student Revolt: Voices of the Austerity Generation painstakingly traces the lineage from the student protests of 2010 to the enormous swell of support for Labour Party leader Jeremy Corbyn in 2015 and beyond. Corbyn made the abolition of tuition fees a priority of his 2017 election campaign, a pledge that has been credited with the twenty-five-year high in youth turnout. But fees had become a proxy for political attitudes, rather than the single issue on which entire elections would hinge.
Many of those who contributed to Corbyn’s success, including co-founder of the pro-Corbyn campaign group Momentum, James Schneider, and founder of challenger news outlet Novara Media, Aaron Bastani, had been present at the student protests, and built support for their respective causes through the communities that were forged there.40 Momentum’s aim was to circumvent resistance within the Parliamentary Labour Party, which sought to undermine the Corbyn leadership and impede its progress towards electoral success, by harnessing the huge swell of support for Corbyn on the ground. Using the internet as a means of reaching this support base and educating it on the procedural methods of the party, it aimed to effect wider organisational change, including the election of local councillors and members of the party’s various governing and steering committees. Novara, on the other hand, planned to establish itself as a legitimate left-wing challenger to the British media, whose spectrum still largely extends only as far as the liberal centre-left.
In both cases, the style of delivery would be predicated on a sense of wit and incisiveness, making them relatable, but more importantly, exposing the Troy McClure technocrats at the steering wheel, as well as the psychological manipulation of advertisers and the biased agendas of even apparently neutral outlets such as the BBC. The legacy media would broadly respond by trying to mock and delegitimise these emerging platforms and their spokespeople, denigrating them on the basis of obscurity and smugly seeking to lump them in with the same internet trolls who had propelled the right-wing cause in America, or the left-wing conspiracists over at Skwawkbox and the Canary — an argument essentially hinging on the fact that these outlets existed online, which would seem absurd to a generation who had grown up with the internet and viewed it as a fundamental part of everyday life.
In the run up to the 2017 general election, the legacy media made a grave mistake in underestimating just how far it had estranged the general public, and how hungry people were for an alternative. Its authority was no longer intact, and what supremacy it had once enjoyed was based on a monopoly model that the internet — with its infinite opportunity and abundance of information — had undermined. The print media in particular would misattribute its struggle solely to the internet’s offering of more free stuff, without realising the very subtle but crucial difference: that it was easier for people to access a variety of different viewpoints and thereby judge the legacy media to be both deeply homogeneous and biased. While there might have been some truth to the legacy media’s claim to more rigorous editorial standards, its blind spots also served to undermine this message. As the realities of austerity began to set in for working- and lower-middle-class people up and down the country, the establishment media would start to feel increasingly out of step.
In a study that was widely quoted by the British press in 2011, it was claimed that in the years that followed the economic crisis, a third of all Brits found work through family connections and friends.41 Education alone would not suffice in the modern workplace; as the report’s author was quoted as saying in a Telegraph write-up of the time, “In this tough climate it’s essential to develop contacts and relationships in your chosen field whilst also bringing other skills to the workplace such as self-motivation, dedication and leadership qualities”.42 Nowhere was this more true than in the highly nepotistic media, and the implications were twofold. On the one hand, there were fewer job postings and less transparency in hiring, leading to fewer opportunities for working-class people. On the other, hiring criteria were expanding to span not just the educational and experiential but also certain personality traits, including confidence, sociability and networking ability. On top of the education requirements that Blair had essentially stipulated as necessary for entering the modern workforce, applicants from working- and lower-middle-class backgrounds were also required to study the behaviours and vernacular of their middle- and upper-class peers, whose schooling and social background had furnished them with these skills.
In the ultra-competitive climate born of the media and creative industries’ tightened purse strings post-2008, it would be the stories and output of the people who most effectively embodied these traits that would prevail. The outlets declaring themselves to be the chief oppositional force in the war against misinformation and fake news often failed to see their own limitations and biases. As the true horror of austerity started to unfold, leading to unprecedented hardship defined by rising homelessness, joblessness, hunger, poverty and a mental health crisis, the cultural climate of Britain had never looked so wealthy or posh, doubling down on storytelling set squarely within the middle-class experience, the aristocracy and the entrepreneurial class. This belied a huge amount of suffering being experienced by the poorest in society, little of which would ever find its way onto TV or into the mainstream media, but which the internet was doing a great job of surfacing nevertheless. Meanwhile, in would step a new era of models, musicians, actors, writers, fashion designers and artists who were blue-blooded and proud. Kate Moss’ Croydonite “Get the London Look” — the line used in Rimmel ads post-Cool Britannia — would be replaced by Cara Delevingne’s far plummier version, which would steadily become the norm, with working-class models such as Jourdan Dunn emerging as outliers in an industry dominated by privately educated elites. At the same time, the Nineties figure of the posh anomaly, typified by Hugh Grant, would become the model for a new generation of actors, including Benedict Cumberbatch, Eddie Redmayne and Tom Hiddleston, who became inescapable during the 2010s. Likewise, the world of theatre would largely abandon any efforts to democratise, positing Polly Stenham as the voice of young London, despite her being the heir to Unilever millions and a graduate of the twelve-grand-a-term Wycombe Abbey girls’ school. Festival line-ups would steadily replace a broad church of weird, working-class Nineties bands and the few grime artists who had managed to make some early headway in the mainstream — Wiley, Dizzee and Lethal Bizzle, for example — with Florence and the Machine, Mumford & Sons and the Maccabees (whose members include an Orlando, Felix, Rupert and Hugo). Under the blue-blooded auspices of David Cameron and his acolytes, and following the flagrant abandonment of the working classes by the Liberal Democrats and the centrist factions within the Labour Party, the wealthy elites were given carte blanche to flaunt their culture, almost in mockery of everyone else. Gone was the shame of being posh that bands such as Blur had suffered under.
In Kill All Normies, Angela Nagle oversimplifies the trajectory of cyberutopianism by focusing solely on the culture wars of Trump’s America and highlighting the fatal lack of leadership and central ownership of online movements from Occupy to the Arab Spring. But rather than being characterised by a swell of online vigilantes, both Momentum and Novara harnessed the internet for strategic gains in order to circumvent an increasingly monocultural media. In retaliation, the legacy media has often seized on a narrative whose limitations Nagle’s own analysis also falls victim to, focussing solely on a very particular, right-wing strain of online political organisation and dissemination of ideas. Nagle goes on to state that as the old media dies:
gatekeepers of cultural sensibilities and etiquette have been overthrown, [and] notions of popular taste maintained by a small creative class are now perpetually outpaced by viral online content from obscure sources, and cultural industry consumers have been replaced by constantly online, instant content producers.43
While it would be hard to deny this shift in audience models, and the fact that the internet has irrevocably changed the face of creativity and cultural output across the world, the assessment is still too strong. The establishment gatekeepers are rattled, but at the time of writing, their success in undermining these new challengers must not be underestimated. The legacy media will continue to peddle the myth of its superior quality compared with the citizen outlets emerging online in order to justify its continued existence and perpetuate the status quo; and while we as consumers must remain vigilant about the funding models and ethical standards of any newly emerging outlet, we must remain equally vigilant about the legacy media’s campaigns to discredit any challenger.
What became increasingly apparent during the 2010s was how far the media had become complicit in the politics of neoliberalism, its gatekeepers hailing from an ever-diminishing pool whose interests were largely served by the status quo and who therefore had no incentive to oppose it. The turnout of young voters to elect Jeremy Corbyn as leader of the Labour Party in 2015, and then to re-elect him a year later, was confirmation that their appetite to engage with the issues and policy decisions affecting the whole of society had been reignited. And to continue the strain of theoretical analysis to which many had first been exposed at university, they would require a media willing to interrogate the structural factors governing society, rather than one that served as a complicit extension of them.
The internet would prove to be both a driver and a measure of this younger generation’s newfound “wokeness” — a term that has latterly attracted so much derision as to become almost meaningless, but which in the first instance did well to characterise the millions of mainly young people who had cut their teeth on academic theory of the post-structural and post-colonial variety. There was a sense that the veil had been lifted: pacifism and anti-imperialism would no longer be marginal causes championed exclusively by minority communities who stood to gain the most from abolishing structural racism and elitism; dusty intellectuals; and your one jangling aunty who lived on the south coast. Instead, they would become the cornerstone of a generation determined to do things differently. More crucial than the fact that more young people than ever before sought higher education was the fact that they were now using it not just to add to their CVs, as Blair had intended, but to further their understanding of society, culture and politics, with a view to improving and shaping them for the better.
The capitalist thinking of the preceding two decades had dictated that more people than ever before pursue education, not merely for economic purposes but to fulfil some greater moral obligation and sense of duty. It was easy for those in government, and even for an older generation, to continue perpetuating the myth of education as a means to success and prosperity — and to use this to justify the hike in fees. But those on the ground, having endured the indignity, stress and financial burden of putting themselves through university only to wind up broke and jobless, were forced to confront the limitations of that logic. It would be they who, armed with their newly attained powers of deduction and critique, would begin to analyse their mounting anger and interrogate the systems that had led to their limited fates. Meanwhile, the transformation of British culture into a who’s who of public school alumni was the backdrop against which a decade of unparalleled hardship and austerity was wrought, adding insult to injury for the millions who suffered, and demonstrating unequivocally that their kind would only ever be permitted provided it served the interests of the upper classes. As I will detail further in the following chapter, it was a decade whose cultural output, largely insipid and dull, will be mostly forgotten, but which belied a rising current of resentment and anger, from which one of the most exciting and revolutionary periods in British politics would ultimately emerge. The intellectual strain that had not only informed the student protests but been cemented by the meeting of minds that happened there would far outlive the occupation of the Tory Party HQ at Millbank Tower.