Friday, May 02, 2014

Hard Men of Letters

Mary Beard’s insightful (and suitably uncomfortable) lecture on the suppression of the public voice of women is a good starting point for a more general discussion of oppressive language in the digital age. This is no small matter of etiquette. Abuse is not everyone’s cup of tea. While some revel in it, many others are put off from bothering to contribute. This is not just an infringement of their rights but a loss for us all. It allows the oppressive a greater say than other, perhaps more rewarding, commentators.

First then, why might someone tend to lace their postings with aggression and personal abuse? Here are five possible reasons:

1. The writer is incapable of having his views questioned without becoming enraged.
2. The writer is prone to acts of violence, and given access would physically assault those who dare to disagree with him.
3. The writer is physically a wimp who vents his wimpish frustrations by talking tough from the safe end of an internet connection.
4. The writer has decided that a bullying tone is the standard vernacular for digital discourse.
5. The writer’s arguments don’t stand up to intellectual scrutiny so he uses a threatening tone to scare off those who might expose this shameful secret.

For once you'll notice we can dispense with gender neutrality. For all the media’s fascination with savage females, Raoul Moats still outnumber Joanna Dennehys by hundreds, perhaps thousands to one. We can be confident that similar ratios apply when it comes to vicious and threatening writing.

Secondly you’ll notice that as character portraits go, they're none too pretty: 1 needs counselling, 2 is a thug, 3 is a coward, 4 is a conformist poseur and 5 is Alf Garnett. As to motives, we can identify two key areas - the emotional and the strategic - with a certain amount of crossover. Of the emotional, some suggest a simple lack of self-control, a lashing out, a tantrum. Others crave emotional victory, to make the writer feel better and the victim feel worse. Another emotional drive is the desire to paint a certain image of oneself to other readers, in this case one of toughness and ruthlessness. It's a narcissistic projection of a 'tough guy' image into the world, much like the John Wayne swagger or the shaved head and Doc Martens of yore.

Then we have the political and strategic motives - notably number 5. The hope here is that by roaring like a lion you will frighten off those who find holes in your arguments or beliefs. We might call this usage ‘rhetoric for meat-heads.’ It functions like rhetoric, in as much as the objective is to change the subject. Of course true rhetoric takes skill - you need to maintain the pretence that you haven't changed the subject - but it's a toss-up which is more despicable. Personally I'd rather be called an ‘effing something-or-other’ than get caught up in the lawyer-like twisting of an Oliver Kamm. At least when someone bad-mouths you, you can immediately call 'game over'. With rhetoric, irrelevant but plausible-sounding distractions are dangled like worms into the pond, and it can be very difficult not to bite.

Of course you don't have to roar on your own. On like-minded websites and message-boards the regulars may be roused into collectively rounding on a detractor. This works even when the detractor has made a valid point, and the consensus view is at fault. As in the playground, combined snarling can produce an illusion of winning the day, seeing the victim off, belly down. Truth is less relevant when you're backed by a mob.

Still that’s all the obvious stuff. Obscene names and threats to disembowel are easy to condemn, and most professional journalists and commentators distance themselves outright. But there are more subtle ways to harness the power of abuse. While such writers might be more dexterous in avoiding the accusations, the aims are as unclean - probably some mix of those suggested above. They abound in an article about Mary Beard, which she mentioned in her lecture. It was written by Rod Liddle for The Spectator. I was going to call it a particularly mean-spirited article, but looking at his other entries it seems a mean spirit is at the heart of this column.

So, how to abuse with impunity? One way is to distance yourself from the abuse you wish to convey. Rather than throw the mud you can highlight the mud others have thrown. Liddle’s article begins with two quotes, the second of which is an example of the online abuse Beard received after a TV appearance. It seems fair to assume that Liddle cherry-picked this quote because he likes it - he finds it funny. It has the same ring of ‘crass intellectual’ he projects in his own writing (for evidence of the ‘intellectual’ bit look at the first quote). But by pointing rather than slinging he can have his cake and eat it. He can even play it that he was only quoting it so as to disapprove (tut-tut). The damage is done either way.

Alternatively, you can put your own abuse into the mouths of others.  Liddle suggests that Beard is frequently invited onto TV because “They think she looks like a loony. And the TV companies, the producers, love that.” While we should never doubt the cynicism of TV producers neither should we doubt that this is Liddle’s own sly way to call names. It is his word choice, no one else’s. ‘Her eccentric appearance’ would have conveyed his meaning just as well, if it hadn't been his intention to abuse.

Alternatively, you can make sensible, critical points but couch them in an unnecessarily harsh manner. Rather than suggest that someone is naïve you can ask ‘Is she really that thick?’ (Liddle, again.) And if all that fails there’s always the ‘satire’ fallback. You can paint your abuse as a joke: This is a light-hearted column. Have these people no sense of humour?

This all points to an interesting aspect of abuse, one that is easy to forget. In terms of information, abuse is a void. As information theorists put it, the content of abuse is redundancy rather than information - at least regarding the subject being discussed. Abuse doesn’t take a discussion any further forward, it’s a means of sidestepping it - rhetoric again.

For example, I might wish to propose that George Osborne's social background impedes his ability to empathise with other social classes. To express this I could refer to him as an 'Eton-educated aristocrat who is clueless about the suffering of the poor.' Or I could call him a 'toffee-nosed Bullingdon-boy whose only experience of the working classes was the servants he beat at Eton.' Now, while the latter description might be more colourful and perhaps raise a smirk (though barely, as it's so hackneyed) it takes us nowhere regarding the original proposition. It might serve to garner dislike of Osborne and his class (perhaps one of my ulterior motives) but it adds nothing to the question of whether his social background blurs his view of other social classes.

This is further complicated by the fact that some words qualify as abuse in some contexts, and information in others. Calling Mussolini a fascist isn't abuse. Calling a Southern Water customer-service operator a fascist certainly is - however steep your bill. Calling a UKIP candidate a fascist is probably just abuse. If you harbour the suspicion that UKIP could be the seed of a growing fascist movement, then we could grant it some informational content. But in all likelihood it is just abuse. 'Myopic Little-Englander’ is certainly an abusive description, but in this context it is far more information-rich. Given the welcome absence of jackboots, at least this description gives you and the candidate something meaningful to chew over on the doorstep.

More confusion arises with the mixing of abuse and information, even in the same sentence. Just for a bit of balance here's a concocted but typical sentence one might read on a left-wing message-board, or comment thread:

“It wasn’t even Marx that said that, it was Hegel - you twat!”

While the part to the left of the dash might well constitute information, that to the right is pure abuse. Rather than substantiate the argument it functions more like a slap round the head, for emphasis. And the parallel is telling. Aggressive talk is often the prelude to aggressive acts. It all depends whether we are dealing with a number 2 (thug) or a number 3 (mouthy wimp). And we can certainly smell some number 4 here - the conformist poseur. You have to ask, what is someone who knows the difference between Marx and Hegel doing using such a term, at least in a public setting? Who or what are they pretending to be?

Comedy can’t steer us round this distinction, either. As with any other form of abuse, the abusive content of satire is void of information relevant to the subject - no matter how funny or satisfying it might seem. Politically, satire is a sop. It’s the momentary satisfaction of flicking the V behind the headmaster’s back. It changes nothing in the structure of the school. Steve Bell can continue to draw Tony Blair with grotesquely dissimilar-sized eyes, and Peter Hitchens can continue to refer to him as ‘The Blair Creature’ for another twenty years. It won’t move him an inch closer to The Hague. Only the facts of Blair’s time in office can do that, and we are stuck with a media determined not to inspect them. So enjoy that satire, but don’t kid yourself that it changes anything.

The internet has certainly put a lot more ugly words on public display. Outside journalism, thuggish writing used to be confined to the toilet wall. Nowadays any semi-literate with a cell-phone has a global noticeboard for all their unpleasant recommendations. But let’s not understate the role of the old media in all this. Liddle has the gall to bemoan our fall from grace - ‘the internet has shown us as we really are, which is not terribly nice’ - he whines. This is certainly evidenced by the comments section beneath his column, but can he really be surprised considering the content of his posts? Has it not occurred to him that a mean-spirited column will tend to whip up a mean-spirited crowd? Is he really that naïve?

So while the increase in abusive writing facilitated by new media is regrettable, let’s not pretend we’re powerless. Abuse builds like a head of steam. We can choose not to add to the pressure. While we cannot coercively control the utterances of others, we can keep our own houses in order:

Thou shalt not resort to mean or threatening language, regardless of how rude or incendiary one might find another person’s writing.

…or some such. This really can have a positive knock-on. Abusers look so much odder and more isolated when no one else will join in. They really are left barking in the dark. It’s not always easy of course. This article took a lot of editing to remain consistent with its own message. When the temptation arises we need to think carefully about our motives. We shouldn’t fool ourselves that this particular insult is necessary to establish a finer point. Abuse can never clarify an argument. It’s always about something else.

Monday, February 10, 2014

Whatever happened to Channel 4?

How did Jeremy Isaacs’ daring, visionary child of 1982 grow into the shameless, ratings-chasing harlot of 2014? The answer is obvious enough. Since television went multichannel, and advertising revenue was spread thinner than Marmite, innovative commercial television has all but died.

Forgive my harking back to a golden age, but there is good reason to see the franchise arrangements of the mid-1980s as just that. Just four channels - two licence-funded, two commercial; two safe and populist, two risky and rule breaking. If anything, the commercial minority channel was the more daring - 4 regularly outshone 2. With only one competitor selling advertising time Channel 4 could charge fabulous rates, and so commission pretty much any programme it dared. The lack of alternative channels guaranteed a large enough rump audience to still make it worth the advertisers’ while. And for the viewer, less ‘choice’ meant a greater chance of bumping into something unexpected, radical, even life changing. Or perhaps, better still, switching off and sampling lived reality for a bit?

Such were the days. Nowadays poor old Channel 4 News can’t give its breaks away. You can often sail through the whole hour without suffering a real advert - most ‘adverts’ these days are just trailers for C4’s higher-rating programmes. It’s a small mercy considering that during the rest of the evening we can now expect four lengthy interruptions per hour, rather than the old standard of three. 

And as for programme content, the sheer desperation is enough to make Raymond Briggs’ Snowman melt in embarrassment. The worst culprits fall under the umbrella ‘broadcasting for the enrichment of Davina McCall’. First, the game-shows, where members of the public who, oddly enough, also look, dress and talk like Davina McCall demonstrate their profound ignorance of anything of consequence. The noble objective, naturally, is to win a huge sum of money, perhaps enough to afford to bump into Davina in Ibiza next summer.

In stark contrast there are the celebrity game-shows, presented by Davina McCall. It’s a simple but winning formula: unusual task + celebrity = television programme, and all the more affordable as in this context ‘celebrity’ refers to anyone who has ever walked past a TV showroom. It turns out that Derek and Clive’s ‘Celebrity Saviours’ and ‘Blow Your Tits Up’ were not satire, but premonition. This month we have ski jumping + Anthea Turner; next month toenail cutting + Bez.

Another growth area is ‘Voyeurs and Exhibitionists’. This got properly underway back in 2000 with Big Brother, hosted by Davina McCall. It's diversified hugely since then of course, one notable incarnation being the pseudo-documentary. This genre is particularly attractive to Channel 4 as it allows it to trade on its former reputation for daring themes and controversial social comment. But the veil is thin. More often than not they’re just "Fuck me, Doris!" stories, as one Murdoch drone famously put it, in reference to the reaction she hoped to provoke in her readers. We might call them ‘the shock of the “eeyew!”’. Earlier centuries employed a more honest term - freak shows.

So roll up, roll up, for exhibitionist gypsies, trans-genders, welfare recipients, victims of severe birth defects or bodily disfigurement, obsessive compulsives and the morbidly obese - all lined up before the inspecting eye of the ghoul-squad, tuning in to express their compassion, feigned or otherwise, or as often, open disgust and contempt. If this seems unfairly caustic then one only has to ask, why now and why so much? Previous generations of viewers managed to get by without wall-to-wall footage of the bizarre and unfortunate. Even allowing for the possibility that some people do watch these programmes in good faith, you can be sure that for the broadcasters the distinction is irrelevant. All that matters is that people are watching, and advertising space can be sold.

When advertisers can pick and choose where they advertise they will inevitably call the shots to the broadcasters. In fact there is only one shot they call - ratings, ratings, ratings. Humans being humans, the traditional themes of sex, violence and material wealth will always be the most eye-catching. So television ends up as a competition to screen the most hyper-sexualised, hyper-violent, hyper-gross television, placing the cheapest, rudest, meanest, strangest, brashest and vainest people centre-stage. As in any war, in the war for ratings truth is an early casualty. Audience figures are sure to be low for facts that people don't want to hear. So if you want a documentary on global warming, for example, feel free to commission one based on falsified data. You can then frame the consequent furore as 'healthy debate' (the same tactic, we note, currently being employed to publicise 'Benefits Street').

Multi-channel TV is a disaster, presumably irreversible. This should come as no surprise. It was what every old-school television executive predicted. We only had to glance across the Atlantic to see the coming horror. Channel 4 is by no means the UK’s worst example - it is frequently out-barrel-scraped by licence-fee-funded BBC3, using depressingly similar programme formats. But whereas BBC3 was always meant to be dreadful, to match the dreadful standards of its day, Channel 4 is a lost gem. It’s the clearest example of the degeneration that necessarily occurs when commercial television becomes the servant of the sponsor, rather than the other way round. We can safely assume that it wouldn't have been granted a licence in the first place if its current output could have been foreseen back in 1982.

Monday, December 09, 2013

What if the IRA had never fired a shot?

What if the Provisional IRA and all other Republican paramilitaries had never fired a single shot, or planted a single bomb? Would the situation for the Catholic community in Ulster be better or worse today? I’m not posing this rhetorically, but as a point of discussion. Many will jump to respond that events such as Bloody Sunday render this game meaningless, worthless. Ulster Catholics would not, could not, and perhaps should not, have simply turned the other cheek. But whatever your take I only ask you to suspend it for a moment. We can be sure at least that this was not a logical impossibility, and not without historical precedent. For better or worse, it is not unknown for humans to respond to intense violence and oppression with a stoical determination not to respond in kind.

Jumping straight to outcomes then, the important ones would seem to be twofold: death and politics. Assuming an absolute and unconditional commitment to non-violent protest by the entire Catholic community of Ulster, what might have been the outcome in these two areas?

First, deaths. Would there have been fewer casualties in the Protestant community, fewer deaths of British soldiers and mainland British civilians? - undoubtedly. This is the unambiguous good news. No Claudy, La Mon House, or Enniskillen; no Newry or Warrenpoint; no Guildford or Birmingham, or Brighton Grand. All those people alive and intact today, assuming they hadn’t died since in a more peaceful manner. All those tears never shed, and hatreds not fomented.

What about fewer deaths in the Catholic community? This is more contentious. If you hold the ‘cycle of violence’ view then again yes, undoubtedly. Like the Tango, it takes two to create a cycle of violence. Every Catholic killed in a reprisal for an IRA shooting or bombing would presumably still be alive today, or represented by the children or grandchildren they never had. But as we've only asked Republicans to lay down their arms it's harder to gauge. Violence against the civil rights movement cannot be framed as ‘reprisal.’ That was prime-mover violence, original sin, and we have little reason to suppose it would have diminished if the Republican side had remained passive. Indeed the IRA liked to paint itself as the armed protector of the Catholic community. To whatever degree that was true, then perhaps some Catholic lives were in fact saved by the IRA's presence.

What about those killed by British soldiers? Would the British army have been in Ulster in the first place? Legend has it that they were first sent in to protect the Catholic community. If this is true then it is possible to frame all violence by the British army as a response to Republican violence - the cycle again. It depends how much you believe the British army and British government were there as neutral peacemakers, maintainers of empire, or in cahoots with the loyalists.

Next, politics. For Ulster Catholics, would the political world of 2013 be better/worse/comparable if the IRA had never fired a shot? Would a body comparable to the current Northern Ireland Assembly have arisen without 30 years of bloodshed? Assuming it would have, would the Catholic hand be stronger or weaker on such an assembly? Would the treatment of Catholics as a class in Ulster - housing, education, job opportunity - be better or worse? How would the mutual opinions of a Catholic and a Protestant stranger differ today, sitting opposite on a Belfast bus? Would there be more or less fear, respect, suspicion?

While we can be sure things would not be the same as they are today, we can be equally sure that they would not be the same as 1969, either. Northern Ireland was not set in aspic, and the Troubles were not conducted in a vacuum. Britain and Ireland sit adjacent to a vibrant continent which itself underwent huge changes over this period - notably the collapse of dictatorships in Spain and Portugal, and the fall of Communism in the East. Much of this was achieved through political pressure, internal and external, rather than bullets and bombs.

And of course there is the EU itself. It’s not beyond possibility that it might have exerted greater pressure on Britain if Britain and the Loyalists had been the only belligerents. With the addition of IRA bombs however, perhaps it was easier for the British government to paint the whole thing as a war on terror - Britain under attack, rather than the rights of Irish nationalists. Could continental Europeans even discern the plight of the Catholic community through the fog of IRA incendiaries? God knows, most mainland Britons couldn't.

I stress again, I am not posing any of this rhetorically. I really haven’t a clue. But it's something that deserves thought. Hopefully Ulster has been through its worst pain and is safely on the other side. But the world is still full of comparable disputes. Anyone considering lending support to one party or another, materially or vocally, might do well to consider Ireland's case before pledging.

At this point one might ask, why the hell should the oppressed be the ones to lay down their arms, rather than the oppressors? I can only suggest, for reasons of pragmatism. The asymmetry of the forces involved in disputes like the Troubles makes military victory all but impossible for the weaker side. War proper (though still deeply indecent) is a mutilation contest. The aim is to out-mutilate the opponent - and victory to the last man standing. This necessarily requires a degree of material equality between combatants. Otherwise it's all over before you get a chance to call it war (you have to settle for calling it liberation).

In situations like the Troubles, on the other hand, the imbalance of power makes this form of victory near-impossible. The IRA was never going to 'take' Belfast like Monty took Alamein. So they chose guerrilla warfare. This is tactically very different. Rather than out-mutilate the opposition to capture the castle, the guerrilla army uses arbitrary small-scale acts of violence in the hope that this will frighten the opposition into vacating the castle voluntarily.

I won't comment on the morality of such tactics, but permit me one observation about their strategic value. Consider that the IRA's explicit war aim was the dissolution of the province and reunification of Ireland. I don't want to rub salt, but this goal is no closer today than it was in 1969. Indeed Sinn Féin now participates in the governance of the very province it was fighting to dissolve. How does that stand up as a victory for guerrilla warfare? Is there nothing others can learn from this?

Returning to the original question, for Ireland at least, both answers are awful. If we believe that comparable civil rights could have been secured by the Catholic community without taking up arms, then thousands died for nothing. Alternatively, if we believe that these gains could not have been secured without Republican terror then we would have to conclude that intense violence is sometimes the only way for one group of humans to secure basic rights from another - in which case our whole species should hang its head in shame. My personal hope, without much confidence or evidence to support it, is that the first conclusion is true, grim as it is.

Thursday, October 03, 2013

One person’s base is another’s superstructure

It is a truth universities acknowledge, that academics would be stranded without administrators – well, it’s acknowledged by us administrators at any rate. Of course, academics can reply that without teaching staff administrators would be equally stuck, perhaps more so. As education is the end product, it is the teacher who is truly indispensable. While it is possible to deliver teaching without support staff – say as a freelancer delivering private tuition – it is not possible to administer teaching without any teachers.

Within a large structure like a university, however, it is safe to say that both academics and administrators are essential. Without one the other can’t function, and neither can the institution (phew!)

This hints at a wider principle, and regular source of confusion: While it is often possible to identify ultimate dependencies, this may not be the best way to comprehend a complex system.

For example, the following three claims seem sound enough: DNA can exist without organic life, but organic life cannot exist without DNA; Brains can exist without intelligence, but intelligence can’t exist without brains; Thought can exist without culture, but culture can’t exist without thought. Each step of the way, we can see the causal arrow pointing firmly in one direction, with the consequence fizzling out as soon as its cause is disconnected, as surely as a TV picture fizzles out when the plug is pulled.

But not so fast; there are more dependencies at play here. While DNA may be essential to life, the profusion of DNA on earth today is itself a product of life. It is living bodies that have borne and protected DNA over the ages, and given it the means to replicate.

Likewise, a human brain will quickly die without the feedback provided by its intelligence. The limited intelligence we all start out with (and often end up with) is not enough to keep our brains and bodies alive. Other, more intelligent, humans need to be at hand, or drafted in, to care for us at these times.

And while culture is clearly a product of thought, it also feeds back into thought. Indeed culture can lie dormant for millennia, without a single oxygenated corpuscle for sustenance. Ancient texts and images can be rediscovered, dusted off, and go on to alter the course of human history. The causal arrow really can point in the other direction.

Speaking of dusty texts, the causal relationship I want to look at here is a political one – the base/superstructure model suggested by Karl Marx. With arguable degrees of metaphor, Marx sketched out a model of society comprising two interrelated layers: An economic base, beneath a social and cultural superstructure. As the ‘building site’ flavour of the model suggests, the continued existence of the superstructure depends on the integrity of the base. If the superstructure is the house then the economy is its foundation. If the foundation caves in, the house follows.

This principle is clear to see at the most basic level of economics – allocation of food. You can have bread without circuses but you can’t have circuses without bread. However informed your politics, however refined your art, however noble your dreams, all mean nothing without food, without the economic means of staying alive. As both Marxists and Monetarists agree, it’s the economy, stupid. You can’t eat books.

Marx took this relationship and expanded it to describe society as a whole. All of human production – owners, bosses, workers and customers – formed the economic base. The rest of society – politics, art, religion, education, entertainment and the military – formed the superstructure. As with the most basic commodity – food – such superstructures depend upon the support of the economic base, rather than the other way round: You can irrigate the Nile Delta without building pyramids, but you can’t build pyramids without irrigating the Nile Delta. You can run Lancashire cotton mills without running day trips to Blackpool, but you can’t run day trips to Blackpool without running Lancashire cotton mills.

These last two points illustrate another deep insight in Marx’s model. Not only does economics support culture, the particular type of economics determines the particular type of culture it can support. If you live in an age where the key products are apples, wheat and the occasional hurdy-gurdy, then you will likely spend your evenings eating bread, drinking cider and dancing round the fire. If you live in an age where the key products are cars, televisions and computers you will likely spend your leisure-time at the wheel, on the sofa, or immersed in one digital universe or another.
  
For Marx, the 19th century revolutionary, this all pointed to an interesting conclusion: Ultimate power resided with the workers. Every apparent instrument of power was actually at the mercy of the producing class. Politics and propaganda, church and school, police and army, football and circuses, all fed off the economic base, all were mere superstructure, at the mercy of economics.

With their hands directly on the means of production, only the working classes have the means to pull the plug on the rest of society. Whether workers recognised this power is another matter. The ruling classes of all ages do their best to spread the belief that their rule is true, right and unassailable – even amongst themselves. Maintaining such worldviews has been a key function of superstructure throughout the ages. Call it aristocracy, the divine right of kings, hereditary privilege or racial superiority – it all really boils down to the same thing: One section of humanity persuades another section to do all the nasty backbreaking parts of life, so that the first section can live in luxury.

The extent to which workers might flex this terrible power is another matter again. It holds the potential for wholesale human emancipation, or great mischief, depending on your take. At the micro level, the basic threat to withdraw labour – to strike – remains a priceless bargaining chip, and a major determinant of pay and conditions. At the other end of the scale, a union of unions can lead to a general strike, and bring a whole economy to a halt. ‘Holding the country to ransom’ – as was the shriek throughout the 20th Century.

Rather than the solidity of bricks and mortar, the superstructure starts to look more like a house of cards, built on a rug called the economy. The rug is unmoved by the cards, but one sharp tug can bring the whole house down. We must tread carefully though. As lucid and commonsensical as it might sound, there are good reasons to be cautious before grafting this model onto our own lives. I’d like to identify two areas in which the model can mislead as much as it might illuminate. One is timeless and has dogged Marxism from the start. The other relates to historical changes.

The timeless one can be dealt with quite quickly. The discovery that one factor depends upon another is easily confused with the belief that one factor is caused by the other – shaped, detailed and defined by the other. But this really doesn’t follow. ‘A’ can be dependent on ‘B’ without being caused by ‘B’. Cars run on petrol, and are certainly designed with petrol in mind, but there’s a lot more to cars than petrol.

Similarly, while the social is dependent on the economic, it is not fully described by the economic. Sometimes quite the opposite. Although the nature of a given economic base remains a key influence on the kinds of social structures that can ‘grow’ upon it, there is also feedback in the other direction. Take technology for example. Is that base or superstructure in the first place? The principles behind a given technology constitute knowledge, so presumably superstructure. But such knowledge gives rise to products (base) which modify our way of living (superstructure) which in turn alter consumer demand (base). The process is dialectical, as Marx would have been happy to call it – an ongoing inter-modification between base and superstructure.

Failure to grasp this distinction lies at the heart of some of the more vulgar and disastrous readings of Marx. While it is only wise to conclude that it is impossible to build a just and happy society upon a grossly uneven economic base, it is far from wise to assume that enforced equality at the economic base will simply percolate up through the superstructure, and produce a just and happy society. Indeed, the 20th century was littered with examples of enforced material equality leading to social hell.
 
One might wonder why this misunderstanding is so hard to shake. Political desperation certainly plays a part – who wouldn’t like a simple answer to the world’s problems? But the fundamental problem is the grain of truth. For the genuinely impoverished, economics really is everything. When you are truly hungry food is all that matters, and at present that describes the situation of one in eight humans. It is easy to see why the global poor, and those who care about the global poor, might be tempted to take that basic truth further, and try to map it onto the rest of human society.

In truth however, while food and shelter are necessary for a happy life, they do not guarantee it. The meeting of basic material needs usually just facilitates a different level of misery, uncertainty and powerlessness. Rather than being answered, political questions in fact multiply from that point upwards.

The belief that all social injustice can be reduced to economic injustice is certainly attractive, and politically emboldening. It contains within it the hope that an equitable economic base can automatically give rise to human happiness. But while greater economic equality might be an essential step towards peace on earth it is little more than religious faith to think it will necessarily lead there. As with all articles of faith, the burden of proof lies with the believers.

The second confusion takes a bit more explaining. It relates to historical changes since Marx’s day. Let’s begin though by confirming what hasn’t changed: The majority of human suffering remains a direct consequence of material inequality, born of inequality in the ownership and control of production. Billions of humans still live hand-to-mouth, paid a pittance by corporations who syphon off for themselves the wealth produced.

Just the same, the superstructure erected on today’s economic base still functions in maintaining, justifying and enforcing this inequality. Humans are still numbed by churches, diverted by sport, infuriated by immigrants, heartened by patriotism and threatened by lunatic foreign powers, not to mention the enemy within. And if we still won’t play quietly there are truncheons, bayonets, daisy-cutters and drones to persuade us. And as ever, the fear that prompts this ideological and physical barrage is the same as in Marx’s day: the recognition that workers have the means to unplug the whole operation – should they choose to.

What has altered sharply however is location. While much of the capital of capitalism is still raised in the west, much of industrial production now takes place in the developing world, notably the Far East. As workers, far fewer westerners have anything to do with actually making things. Aside from food and cleaning products, what else in our supermarkets is produced at home? This change of location gives us several reasons for caution before mapping Marx’s model onto our own lives.

Let’s start with production itself, and a commonplace scenario in our new private sector. Let’s say some British capital finds itself invested in a toy factory in Asia. The investors could be shareholders in an established UK toy company, or shareholders in an otherwise disinterested body such as an investment fund, which itself holds shares in an Asian firm. All being well, the toys produced are then sold at profit, and an enhanced sum is sent back to investors in the UK. Interestingly however, the toys come along too. Both profit and product are imported. The toys are bought by a British retailer, say ASDA, who sell them at some mark-up to members of the British public.

Now, the interesting question is where does the British public get the money to buy the toys? It’s increasingly unlikely that they get it from producing anything as corporeal as toys. More likely than not, they get it from working in ASDA, or some other retail giant.

At last, the title of this essay emerges. Rather than an integrated part of the process of production, our own economy starts to look peripheral. Certainly, from the perspective of a politically savvy Asian factory worker the UK economy must look more like superstructure than base. It’s more like a game of Monopoly going on overhead, a circular sloshing of pretend money between friends, all the while supported by the genuine wealth producing economies below.

At first glance this might look like business as usual. International capitalism has always involved a hierarchy of productive process, with those countries higher in the chain making the greater profits. Companies and countries whose production centres on intermediate goods, like raw cotton or coffee beans, tend to get the bum deal. Those who produce finished products, like clothing and freeze-dried coffee, are the ones who do well. But can retail seriously be called ‘finishing’ at all? Turning beans into a jar of instant genuinely does add value to the coffee. Putting that jar on a shelf doesn’t add a bean. Any fool can do it.

If this still sounds overstated then we can always test it against the causal premise of Marx’s model – the existence of the superstructure depends on the integrity of the base. On this reading it stands: A general strike in China certainly could bring western capitalism to its knees. A general strike among British retail workers (if we can contain our mirth) would barely register in the Chinese economy. A blip in the demand curve, ironed out as soon as new outlets are found.

However it’s when we turn to the public sector that the full picture emerges. Let’s return for a moment to the profits from the Asian toy factory, as they flow back into the UK. Some go into the pockets of the investors, but a portion goes to the treasury in the form of tax. Consequently some of this goes to fund our public sector, not least the wages of public sector workers.

Along with retail, our gargantuan public sector is another key source of employment in ‘service economy’ UK. Even more starkly than retail, wealth creation here is all but non-existent. Essential as they are, doctors, nurses, police, fire-fighters, refuse collectors and social workers don’t produce a thing. In fact the whole operation runs at a massive loss. Economically, the kindest thing you can say is that they save us money by pre-empting disasters – but that’s a far cry from wealth creation.

Again, from the viewpoint of the Asian factory worker our NHS isn’t an economic entity, it’s pure superstructure. It might not seem this way to a hospital porter, sweating endless hours away on minimum wage. Doubtless the internal feeling is of being a worker, employed by a company called the NHS. As in a factory, both worker and employer wrangle over those timeless issues of pay and conditions. If a compromise cannot be reached there is always the threat of a strike.

But again, we only have to apply the causal premise of Marx’s model to see the true economic relationship here. While it is quite feasible that a general strike in China could bring severe disruption to our NHS, a total walkout of staff at the NHS would have zero effect on the economy of China.
 
This shift has dire implications for anyone holding out for worker-led liberation in the western world. Our own proletariat (if that is still the right term) no longer hold the reins of production. Relocation of industry has stripped their power. In terms of pulling the plug (or the rug) on capitalism our own proletariat has been pretty much neutered.

This might sound an odd claim just after a successful bin-strike, but in fact that strike is a clear example of how depoliticised public-sector strikes really are. It’s tempting for leftists to portray such disputes as a traditional class fight between workers and tight-fisted capitalists, but in truth who were the evil paymasters? It was us, the residents, including the bin-men themselves assuming they live in the area they serve. Nice as it would be, we really can’t blame Rothschild and Carnegie for this one. Rather than class conflict, such disputes are better understood as the public desire for the impossible – low levels of council tax and clean streets.

‘Sticking it to the man’ loses its potency when the rebel is at one and the same time ‘the man’. It’s the same story across the public sector. Schools, NHS, policing, roads, local services, we are simultaneously employer, employee and customer. If we seriously believe we aren’t paid enough, we can always pay ourselves more. Threats to suspend health care, education and bin collection are really only threats to ourselves. If we want to roll in our own detritus, it’s really our own business. International capitalism will not weep.

And of course there is more to public sector spending than paying people to work. It also supports those who aren’t working. Here again wealth created overseas must be playing a part in propping things up. It is difficult to imagine how it could be possible for a large section of working-age citizens to spend weekday mornings watching daytime television without financial contributions from more productive nations. Few workers in Bangladesh can indulge this dubious luxury.

Our long-term unemployed would presumably form a major component of a productive working class, were we a more productive nation. But with welfare payments providing a better standard of living than the wages paid in Asian factories, no domestic factory can compete directly with an equivalent in Asia. Who would sign off to work in a factory for less money than benefits are bringing in – even if such pay rates were legal, given the minimum wage?

As no one outside the crankiest think-tanks would suggest our own unemployed should accept Asian-level factory wages, we are stuck with the situation as it stands. Our best hope is that Asian workers use their new power – good old class struggle – to win themselves better pay and conditions. Aside from the intrinsic justice, this would also inflate the costs of goods from Asia, and make competition from western manufacturers a possibility again. Until then, the outlook is bleak on both continents. For the long-term unemployed, in our dying towns, and post-industrial cities, life will remain an earthly purgatory, drip fed, at least in part, by the developing world.
 
The decline of manufacturing in the UK has left much of our workforce creating either zero wealth (retail, banking, customer service) or absolute loss, via welfare. When we were sold this new arrangement in the 1980s it was presented as a mere skills swap, a transfer from one form of productivity to another. But this just doesn’t wash. Contrary to propaganda, no banker has ever made any money. Only people who create, by hand or by brain, add value. Banking isn’t about making money, it’s about getting other people to make money for you. The money that finds its way back into bankers’ and investors’ pockets is always the fruit of another’s labour. When Chancellor Lawson looked favourably towards the UK’s future as a ‘service economy’ it was really a declaration of parasitic intent. Phrases like ‘Britain now makes much of its money through banking’ are just the finance journalist’s polite way of admitting that Britain now gets citizens of other countries to make its money.

For anyone with a moral interest it’s a curious set-up. Through neither malice nor love, the Asian factory worker simultaneously puts the western worker out of work, pays the benefit cheques, and ends up with a worse standard of living. We pay each other to obtain the luxury goods they produce but cannot themselves afford. While they often struggle to see their children into adulthood, we grow increasingly indignant if our loved ones don’t make it into triple figures.

As our economy morphed from manufacturing to overseas investment, our welfare system itself morphed from something democratic and enlightened, something to be proud of, to something parasitic upon the poor of other nations, every bit as parasitic as the luxury purchases of capitalists. From the perspective of the proletariat of India, discovering that their labour subsidises the UK’s NHS is probably no more comforting than discovering it gets spent on yachts. In both cases, they don’t get to see the benefits. If any state healthcare system is to be subsidised, I’m sure they would choose their own.
 
As a nation we of course have every right to prioritise our own health and welfare if we so wish. There’s nothing immoral in a country deciding to use part of the wealth it creates to provide its citizens with universal healthcare and unemployment benefit. Such policies are widely seen as the great political victories of the 20th century. But the same can’t be said for using wealth extracted from foreign labour. I’m sure Marx would be the first to agree.

Tuesday, February 12, 2013

The Core Disbelief

I can’t vouch for believers, but from the atheist perspective, arguments about religion seem tediously slippery. No sooner does a point seem to be established than the terms of the argument magically change. What sounded like a fact turns out to have been only allegory, and it is the atheist who is branded naive for ever thinking otherwise.

I would like to propose a means of avoiding much of this slipperiness, a simple initial question that can be asked of all participants that will hopefully close off some of these blind alleys. But first, a couple of examples of the problem.

Let’s start with a big one – the nature of God. Listening to Christian prayers and sermons one could be forgiven for imagining a very anthropomorphic being. He listens, thinks, makes, judges and punishes. He bequeaths His only son. Note that this image is not restricted to the Sunday school and the southern Baptist tabernacle, it can be heard on Choral Evensong on BBC Radio Three – arguably the most intellectually high-brow radio station on Earth:

O Lord, have mercy upon us miserable offenders; Spare thou them, O God, which confess their faults, Restore thou them that are penitent, According to thy promises declared unto mankind in Christ Jesu our Lord: And grant, O most merciful Father, for his sake, That we may hereafter live a godly, righteous, and sober life, To the glory of thy holy Name. Amen.

It seems fair to suppose that for some believers this human-like God is taken at face value. In the absence of alternative descriptions, it will surely be the sort of being that coalesces in the mind of any children listening.

However, should one question the plausibility of such a being it immediately evaporates. Nobody actually believes in this sort of God, it seems (nobody of course, apart from the countless millions who still do.) Instead, this anthropomorphic image is turned into mere allegory. The real God is far more intangible. Variously He is life, nature, human love, an ether permeating all the universe, the cement that binds us, or, in line with 21st century consumer choice, any other form the individual believer wants Him to be. ‘God with a beard’, it turns out, was just a foil created by atheist zealots to discredit believers.

A second example concerns the origin of species. For millennia the church propagated the myth of a six day creation. Eventually geology and rationality rendered this story untenable. But instead of simply dropping the idea of conscious creation, the six days are recast as allegory. Much as God’s being has been transformed into a multitude of inexplicable forces, so have His methods: Perhaps He created us via evolution, His powers “having been originally breathed into a few forms” to quote a rather uncomfortable-sounding Charles Darwin, of all people.

With summits as mobile as these it’s no wonder it’s so difficult to plant a flag. One of the chief causes I would suggest is lack of clarity about the basis of belief – what I see as the crucial dividing-line between atheists and believers. I think it can be clarified with the following simple question:

Aside from other humans, do you believe there are any other intelligent agents at play in your life?
 
Extra-human intelligence really is the issue, the dividing-line between the natural and the supernatural. After all, who would care about non-intelligent super-nature? What would that even mean?

If established at the outset, this question can bypass a lot of the slipperiness that tends to follow. But first we must deal with the agnostics – some will question the question itself. Some people will argue that as they cannot be sure whether such forces are at play, the question cannot be answered. This is usually fielded as an intellectual virtue, with Shakespeare wheeled out to emphasise the profundity: “There are more things in heaven and earth, Horatio”.

We can avoid this however with a clarification of what we mean by belief. One might say that I believe in extra-terrestrial life – in the sense of believing it to be statistically likely, given the size of the universe. But clearly this is very different from believing that some aliens crashed at Roswell. The first sense of belief is inert. It doesn’t modify my behaviour toward other humans, not the way that belief in the Roswell ‘incident’ might.

So when I say “do you believe there are any other intelligent agents at play in your life?” I mean ‘believe’ in the strong sense – in the sense that your daily thoughts and deeds might be affected by this belief. Rather than drive the agnostics out of the temple, with this clarification they are given a clear choice to remain or leave. No-one can keep one foot inside.

Armed with this question we can now return to the original two examples. Let’s look again at the nebulous nature of God’s being. Rather than anything as unfashionable as ‘God with a beard’ many of my contemporaries today would be more likely to describe themselves as ‘spiritual’. This is a gloriously slippery term. It could mean they consider themselves to place little value on material wealth. Or it could mean that they believe each human body is animated by an eternal soul.

Asking ‘the question’ cuts right through the fog. If you don’t believe there to be any other intelligent agents at play in your life your notion of spirituality must be a world apart, literally, from those who do. ‘Spirit’ in this context must just be another attribute of physical and mental being. Transcendence, equally, can only be something you achieve within the confines of your own mind, with your feet planted firmly on the ground.

It’s the same for those who claim to believe in ‘karma’. In fact karma is a great illustration. If you answer no to ‘the question’, but also claim to believe in karma, then you can only be referring to an earthly phenomenon, most likely a version of ‘what goes around comes around’: Be unpleasant to your fellow beings and there is a good chance you will suffer in return. Even if those you wrong are unable to exact revenge, your own conscience may settle the score on their behalf – there is no escaping your own feelings of guilt.

Conversely, acting charitably is liable to make other people like you, and lead you to like yourself. ‘A good conscience is a continual Christmas’ as Ben Franklin so sweetly put it. Now see how different this is from the sort of karma that keeps Earl Hickey in check. That is clearly supernatural, requiring super-human intelligence. Something very clever and very powerful must be watching over Earl, and tweaking his destiny accordingly.

It’s similarly useful when we turn to the origin of species. While a six day creation might seem more ludicrous than the idea of God creating things ‘via’ evolution, the difference is in fact superficial. If you answer no to ‘the question’ then you should be prepared to reject both stories with equal vigour. Along with intelligent design, élan vital and spontaneous generation, creation ‘via’ evolution requires an intelligent agent – to do all the planning and designing. If you find it fanciful for an agent to do this in six days, it should seem no less fanciful in all the other cases.

The No camp

When it comes to the thorny subject of assessing the worth of religion, those who answer no to ‘the question’ can be split into two further groups: Those who see no good in religion at all; and those who still value religion despite their own lack of belief. The first group do exist I’m sure, but I should think their numbers are quite small. It’s a rare atheist who rejects every religious act and artefact. Any non-believer who has witnessed the happiness, consolation and social bonds enjoyed by some believers would need a hard heart and some hard arguments to condemn the whole enterprise outright. If they are not careful it might come across as envy (not exactly a sin in atheism, but not a virtue either.)

But notice the bold distinction here. Anyone who answers ‘no’ to the question but still wishes to defend the worth of religion must be doing so purely for its earthly utility. There can be no recourse to pleasing gods or heavenly rewards. Regardless of what the ‘yes’ camp might believe, all benefits remain in the here and now.

Utilitarian justifications for human behaviour are always more complex than they might first appear. Here are a few pointers to the sort of discussions that might arise. The first might be an empirical weighing-up of pros and cons. Any religious belief or practice that appears to increase human wellbeing could be seen as a plus, much as any belief that appears to increase human misery could be seen as a minus – just as we assess the worth of secular beliefs. But as always with utilitarianism it isn’t that simple. Different people have different ideas of good and bad. With God’s judgement out of the picture, who gets to decide which effects of religion are beneficial and which are a curse? Do we just vote on it?

Then there is the question of the motives and methods of such non-believers. If you don’t believe in God but do see worth in religion, how should you yourself act? Should you join in at prayers even though you don’t believe anyone is listening, or just encourage events from outside the church? Both positions have their downsides. There’s something undeniably ludicrous about going through the motions of a religion you have no belief in (I remember from childhood.) Then again there’s something distinctly paternalistic about championing other people’s belief in myths that you yourself don’t believe in. What might be your motives here? Do you prefer other people to have a cloudier worldview than yourself?

Furthermore isn’t there something fundamentally dodgy about encouraging children to believe things we ourselves believe to be untrue? Isn’t education about doing our best to paint an accurate picture of the world, not a mixture of strict truths and any untruths we deem useful?

The Yes camp

Those who answer ‘yes’ on the other hand are faced with a much larger set of potential benefits. Naturally they can agree with the ‘no’ camp on all the earthly benefits (and like the ‘no’ camp, they will need to produce empirical evidence to defend these claims – the earthly pros will need to outweigh the earthly cons.) But of course they have a lot more to champion than this – perhaps too much. If prayers really can be answered rather than just provide comfort, and an eternal afterlife really does await us rather than just prevent us from despairing about the briefness of life, and if eternal ecstasy or eternal damnation really do swing on the judgement of one who watches over us, then the earthly consequences of religion seem almost insignificant by comparison. 
 
Of course many modern believers will be horrified at being associated with the last of these images. Once again, nobody actually believes in hellfire and damnation anymore (apart from the countless millions who still do.)  But such dreadful misattributions are an unavoidable hazard if you answer yes to ‘the question.’ If you declare a belief in extra-human agency but have no means of proving what it is – and there is no means of proving it – you inevitably consign your beliefs to the same factual basis as any other supernatural belief – ghosts, devils and very ruthless gods. That these other beings might sound absurd or vile, while you believe your own to be sacred and loving, has no bearing on proof. Once you forfeit the claim to rational justification for your belief, you can’t really complain if non-believers make assumptions about your worldview. What can we do but guess?

Fine then, some will say, damn the evidence – my faith is enough on its own. And that is everyone’s prerogative. But in truth this isn’t really enough for many believers, and this is how so many of the exasperating arguments arise. If most believers were truly content with their faith alone, and really didn’t give a fig about empirical support, then such abominations as intelligent design, Lourdes miracles and the virgin birth would not arise. Like it or not, we humans are desperate for verifiable truth. If we have faith in supermen, we can’t help but want to show people evidence of their works, to back that faith up.

So round and round we go. Miracles provide wonderful evidence until they are shown to be illusions – upon which miracles no longer matter, all that matters is the comfort religion provides to the believer. Similarly, a complex trait of an organism provides clear evidence of God’s handiwork until a more mundane evolutionary path is posited. Then the evidence of biology becomes an irrelevance, and only personal faith matters.

This urge to cross back and forth, from the earthly to the supernatural, is surely a consequence of all that religion has lost to rationalism. Only two hundred years ago, with little fear of contradiction, the whole universe could be explained away as the product of superhuman intelligence. For better or worse, rational explanation has now taken over most of that terrain. However one tries to paint it, there has been an enormous loss of ground, and loss of purpose. Widespread belief in a god who hand-crafted the universe has been reduced to belief in a god because that belief in itself provides comfort to the believer (and even that is open to question.) It shouldn’t be too much of a surprise that for some believers this new scheme just isn’t enough to satisfy their needs, and the urge to find empirical evidence for irrational beliefs just won’t lie down.

Wednesday, December 12, 2012

What’s the purpose of Higher Education?

One term in, and those working in Higher Education are doubtless feeling the initial effects of the cancellation of the university block grant and the hiking of tuition fees.

For the uninitiated, the block grant was what the government used to pay to universities, in large part, to subsidise tuition fees for UK and EU students. With its cancellation, this year’s fees have taken quite a leap. Roughly speaking, if you began your degree in September 2011 you will probably be paying around £3,500 this year for tuition. But anyone who started September just gone will be paying £9,000 – not far off the £10K+ already being charged to international students. Such a big increase was bound to have big consequences. For many school leavers the choice now is either to forget all about university, or accept that most of their working life will be spent servicing debt to the Student Loans Company.
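To put rough numbers on that debt-servicing fear, here is a minimal sketch, assuming the post-2012 ‘Plan 2’ repayment terms as I understand them – 9% of income above a £21,000 threshold – and ignoring interest and the eventual write-off. The figures are illustrative assumptions, not official Student Loans Company calculations.

```python
# A rough sketch of income-contingent loan repayment, assuming the
# post-2012 "Plan 2" terms: 9% of income above a £21,000 threshold.
# These figures are illustrative assumptions, not official Student
# Loans Company numbers; interest and the eventual write-off are ignored.

def annual_repayment(income, threshold=21_000, rate=0.09):
    """Yearly repayment: a fixed share of whatever income exceeds the threshold."""
    return max(0.0, (income - threshold) * rate)

def years_to_repay(debt, income, threshold=21_000, rate=0.09):
    """Years of flat repayments needed to clear the debt (no interest)."""
    payment = annual_repayment(income, threshold, rate)
    if payment == 0:
        return float("inf")  # below the threshold, nothing is ever repaid
    return debt / payment

# Three years' tuition at £9,000 a year, before any maintenance borrowing:
debt = 3 * 9_000
print(round(annual_repayment(25_000)))      # pounds per year on a £25,000 salary
print(round(years_to_repay(debt, 25_000)))  # years to clear it at that salary
```

On a £25,000 salary the repayment comes to roughly £360 a year; at that rate, tuition debt alone would take the better part of a century to clear – which is the ‘most of their working life’ point in miniature.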

For the universities this has raised a range of fears. Here are six of the predicted horrors I have heard mentioned since the cut was first proposed. I will leave it to colleagues in the sector to decide how many are coming true:

  • A decline in the number of applicants
  • A particularly steep drop in EU applicants as they desert the UK for more affordable countries
  • An unspoken lowering of entry requirements to permit more applicants, followed by a lowering of academic standards to retain them (and thereby avoid…)
  • The closure of less popular courses
  • The tendency for HE to become restricted to the children of the materially better-off
  • The transformation of the university degree from a symbol of ability to a luxury product, one that the purchasing customer will fully expect to be delivered once the invoice has been paid.

As they say in question papers, ‘…discuss.’

Rather than get entangled in that, this will be a look at alternatives. How might things be done differently? One option would be no change, or rather a straight swap back – the re-introduction of the block grant and return to the previous cap on fees. This is perfectly possible. Regardless of what politicians might suggest, the UK economy does not stand or fall on this issue. To be tautological about it, this is just what we would be doing now if the whole question had never been raised. Instead we would all be complaining about higher taxes, or worse public services in other spheres, which of course we already are. As long as there are Jaguars on our roads, and swimming pools appearing in back gardens, there is clearly enough money about to subsidise HE. It’s just that subsidies to HE have been identified as money badly spent. At a time of crushing fiscal squeeze HE has been selected to be squeezed out.

One reason is the belief that Higher Education has grown too big. It’s certainly undeniable that it has greatly expanded. Only thirty years ago it was very much an elite enterprise, serving two elite groups: A moneyed elite – the children of the wealthy; and an intellectual elite – academically bright kids, regardless of background. Over the past few decades however, the student body has swollen to include people from neither group. The not particularly wealthy or particularly glowing have been allowed to gain entry. I certainly include myself in this group, and my full BA, ⅓ BEng and ½ MA attest to it. Indeed I can’t think of many of my contemporaries at Poly who fell into either of the original two elites.

However wise this expansion was, the current economic disaster supports the arguments of those who always saw it as economically untenable. Indeed the quiet growth of our more inclusive and less elitist HE was always accompanied by the quiet chipping away at the finances that make student life possible. First the Housing Benefit dried up, then the grants dwindled and the maintenance loans came in, then the tuition fee loans, then the cancellation of subsidies for second degrees. The removal of the block grant is only the latest, most decisive, stage in the process.

If the government’s aim was to create a smaller HE sector then cancellation of the block grant certainly looks likely to achieve it. But it is hard to see how this is the most intelligent approach. The new system seems to favour mediocre applicants as long as they can pay, while dissuading bright applicants if they can’t. Most rational people would see this as perverse, and slightly demented. Outside the Khmer Rouge there is broad agreement that educational favour should be bestowed upon the intrinsically bright and able. It’s not just a matter of fairness – it’s in everyone’s interests that the excellent have the chance to excel.
                       
So there’s one possibility. Turn the clock back to the 1970s: Raise entry requirements to the point where only an intellectual elite can get in, but then make it free for all who do. Full grants to cover fees and living expenses, and full Housing Benefit. Or if you want a more frugal model we could add a means test, so that the very well-to-do would still have to pay an appropriate fraction. The money recouped could then be used to subsidise the whole enterprise.

Again, this is certainly a workable model, one that worked for decades. But it has its downsides, not least that I never would have gone to university – in itself a national disgrace. But as this is all about reducing numbers, then something or someone has to give, so let's run with it.

Central to support for a return to elite HE is the idea that university really isn't for everyone, even if they might think it is. Perhaps there are things some of us would be better off doing instead? How many workers genuinely contribute extra value to their organisations through their ability to compose a 10,000 word dissertation? Teachers and lawyers certainly, but nurses? Couldn't the world do with fewer engine designers and more engine repairers, fewer architects and more carpenters, fewer MBAs and more small businesses? Might HE subsidies not be better spent on other, more vocational, forms of training such as apprenticeships or free evening classes?

Besides, if degree certificates aren’t restricted to the intellectual elite what’s so great about degree certificates? If as many people can now write ‘BA’ on a job application as used to write ‘School Certificate’ then hasn’t the degree been devalued to the level of a School Certificate? 

If all that sounds unpalatable here’s an alternative. Rather than be elitist about the students, we could be elitist about the courses. Rather than remove tuition fee subsidies across the board, we could reinstate them for those courses we deem worthwhile. This is obviously a controversial proposition. Different people have very different ideas about what constitutes ‘worthwhile’ study. It all boils down to what you believe to be the point of Higher Education.

Some suggested purposes for Higher Education

If the block grant had been fully and rapidly recouped by extra revenue generated by graduates, we can safely assume that it would never have been cancelled. This hints at one perceived purpose for HE. For many of its critics, the point of HE should be the same as the point of life itself - to generate wealth. ‘Pay your own way’ has been the central political mantra of the past thirty years, and many influential people would like to see it applied as firmly to HE as everywhere else. Any subsidies tax-payers invest in HE should be seen to turn a profit, and sharpish too. Perhaps the panel from Dragons' Den can be drafted in to decide which courses meet the criteria?

Aside from the cultural narrow-mindedness, as a strategy this is completely self-defeating. Even if you do hold that the purpose of HE is to make money, the ethos of the Lancashire cotton mill is not the way to achieve it. The unique opportunity provided by HE is that it gives people the chance to think outside the box, or to climb whatever shoulders it takes to see outside the box. The potential wealth created by universities might take decades, perhaps centuries, to pay out. Research, particularly, necessarily entails a large amount of waste. A thousand blind alleys may need to be surveyed before any treasure is found. Crick and Watson would not have fared well on Dragons' Den, but, 60 years on, their discovery is certainly paying out.

A less vulgar, but similarly utilitarian, view is to see HE as a means of staffing the economy - supplying all the doctors and teachers and lawyers and managers society needs. This is still all about the economy, but the wider needs of the economy are also taken into consideration. Such graduates may not generate any wealth directly, but they are needed to keep the rest of us bandaged up and on course.

A closely related, but possibly more subversive, view is to see HE as a means of engineering a better economy, even a better society. Through the judicious application of subsidy we might wean business graduates off the City and back into manufacture, or wean engineering graduates off arms, oil and aerospace, and onto the development of renewables. Which one you pick depends on your politics.
  
Then there are the less quantifiable, more lofty, benefits. Many believe the purpose of Higher Education should be to enrich human existence: University should be about nurturing excellence across the board - art, science, philosophy, literature - without thought to material consequences. While this view does not necessarily preclude profit, it allows for all those courses which never will turn one. Indeed it even makes room for courses that positively oppose profit (a good example is the one degree I did manage to complete, which might well have been subtitled “how to hate capitalism and encourage other people to do the same”). Some would argue that such courses are a crucial element of a free and progressive society, and underscore what will be lost if business is given too large a stake in the functioning of the universities. If HE is to enrich human culture and broaden our horizons it must maintain some freedom and independence from political and economic pressures.

Finally, a couple of more questionable rationales for the existence of HE, or rather, for going to university. The first is ‘for the fun of it’. Although not widely acknowledged, this has always been a serious pull. Long before the term was coined, many of us went for the ‘student experience’ as much as from a burning desire to learn. We’d seen Chariots of Fire and Animal House and didn’t want to miss out on this rite of passage. All very fulfilling on a personal level, but you can see why some tax-payers might not want to subsidise it. If anything, this ‘purpose’ is growing, and is now employed as a recruitment device. Universities are vying with each other to be the hippest, with the hottest social whirl. If, as feared, the current set-up of HE is favouring ability to pay over ability to achieve, we can expect increasing emphasis on these extracurricular selling points. Wealthier parents can look forward to packing their underachievers off to a three-year holiday camp(us), much as they buy them InterRail tickets during their gap year.

Which brings us neatly to the last and possibly bleakest purpose of University: To give school leavers something to do. The desire to go to University rather than into the workplace is being supplanted by the need to go to University because of the near-extinction of the workplace. With job opportunities ranging from terrible to non-existent, University is becoming a place for many young people to hide away for a few years. This is another ‘purpose’ that both tax-payer and Treasury will not be keen to subsidise. If university has become, for some at least, a less humiliating form of unemployment, then government subsidies to HE unavoidably become welfare payments – another reason for them to be cut. Like the official unemployed, the student ‘unemployed’ are forced to turn to their families to bail them out, or to the money-lender in the form of the SLC.

Any campaign to get subsidies reintroduced into HE will be a struggle. The universities’ strongest hand is their cultural and economic worth, but clearly these terms are open to interpretation, if not abuse. Rather than a dogged insistence on a return to the funding of the recent past, perhaps a more forceful argument can be made for funding a more meritocratic and intellectually elitist HE – however that might be achieved. It has got to be better than the brand of elitism the current set-up seems to be encouraging.