Eyes on the Prize

As 2016 draws to a close, so too does my 27th year in teaching. Term four is in many ways my favourite of the terms, containing as it does all the very best and worst education has to offer. To start with the positive, this is a term of school camps and, for our drama department, junior productions. Two weeks ago I spent my days at a local bush reserve, orchestrating teams of thirteen- and fourteen-year-olds in a Survivor-style competition. They worked together, struggled together, cheered on their rivals, celebrated their victories and comforted the defeated. They held one another, climbed over each other, argued about the solutions to devilish puzzles, shared jokes and collectively took a small step towards becoming caring and well-adjusted adults.

This week will finish with three nights of drama productions, just under ninety teens performing three plays in a festival of expression. Again these young people will face their fears, this time in front of a paying audience, and at the end of it will have experienced the special thrill of theatre. They will be proud of all they have achieved, and crucially they will have achieved it as a collective. Their efforts will be in the name of the show, and the privileged act of communicating with strangers, not personal glory. And, because they have not entered the sausage machine of formal assessment, we will not be called upon to rank them from best to worst, as if such a thing is ever meaningful in the arts. No wonder I love my job.

First though, a small cloud hovers overhead, as it does each year, for this is also the season of the school prizegiving. As long as I have taught, I have loathed the event. I remember in my first years of teaching volunteering for car park duty, so as to avoid the inevitable gloom that comes over me on such occasions. Prizegivings represent, for me, all that is wrong with our education system. It is not just that the event is designed such that the last message our students receive from us is ‘most of you weren’t very good’, although God knows that’s problem enough. It’s also the strutting display of our inverted priorities.

Prizegivings are built upon two deeply flawed premises. The first is that the measurables are more important and worthy of celebration than the immeasurables; the second, that to be excellent is to be exceptional. Schools, when they are functioning well, teach a lot of things. They teach young people how to be kind, how to read social situations, how to exhibit self-control in times of high stress, how to find intrinsic value in the small moments and activities of everyday life, how to laugh at the absurdities, embrace the complexities, take responsibility for their own failings and face the world with optimism and curiosity. The students who begin to learn these things are lucky, and frankly wonderful. We also teach algebra, and literary analysis. The students who learn these things are, well, good at algebra and literary analysis, and while the former is at least of some practical importance for a small number of our graduates, neither is the sort of human skill one instinctively feels is worth celebrating. But celebrate them we do, for the simple reason that they’re easier to measure than empathy or humility, and so winners can be chosen.

The second error is even more egregious, for it proposes that so long as you are better at something than anybody else, that something must be a remarkable thing. So we deify Usain Bolt, for example, on the grounds that he is marginally faster than the next fastest human being, and significantly faster than most of us (although a touch slower than my pet cat, as it happens). Now I am as wowed as the next man by the magnificent sight of Bolt in full flow, and was glued to a screen come the Rio Olympics in the hope of watching history be made. But the glory of sport, in the end, is its triviality. It’s a circus distraction, and a rather wonderful one at that, but surely we all understand, deep down, that the world would not be a better place if we were all just that tiny bit faster when we fully extended.

Compare that to the truly magnificent skills, like parenting. Now there is something worth celebrating, because when we get that right, almost everything else follows. And it’s hard, ridiculously hard, and those who do master it, which in fact is the vast majority of parents, are truly excellent. Yet we do not celebrate this form of excellence at all, because we have fallen for the trap of thinking that only the exceptional can be excellent, and so we have failed to attribute value to that which matters most. And of course parenting is one of a hundred examples. Think of the loyal partner, the true friend, the attentive sibling, the dedicated nurse… the people our prizegivings tell us not to care about. Better to raise up the anxious, the externally motivated, the socially insecure and the compliant, and cheer them on into their fragile futures.

But happily, it is only one night, and sanity will return soon enough. One school I taught at didn’t even have prizegivings. Perhaps one day far more will have the courage to follow.

Various Positions

And so, in a year of musical losses, another mighty Totara has fallen. For me Leonard Cohen is the pinnacle, not by any measure of quality or greatness, although such an argument could be made, but by the more simple and important measure of what his work has meant to me. Only two weeks ago I used Anthem as the introductory music to my latest play, in doing so running the obvious risk of everything that followed being something of a disappointment (I once heard Lloyd Cole cover Famous Blue Raincoat in a Wellington bar, and at its conclusion he offered, rather mournfully, ‘the problem of covering a Cohen song in concert is you have to follow it with one of your own’).

My relationship with the music of Cohen dates back to 1985, which makes me something of a newcomer, I know. In an interview some years back Nick Cave spoke of first hearing Cohen’s music in his small Australian town, and thinking, I’m not alone. My first encounter, whilst less grand and fecund, had elements of the same, and I’m sure explains part of my obsession with the man’s work. Growing up in a house some six kilometres south of Featherston counts as cultural isolation of sorts, and in the world before the internet it was only a Sunday evening fix of Karen Hay that gave any hint of a world more sophisticated than The Eagles or a secondary school reading list (when you’re a seventeen-year-old boy, literary novels appear to have been written almost exclusively by self-absorbed malcontents who have trouble connecting – actually… ).

So, when it comes to Cohen, I really do remember my first time. Karen Hay introduced him as another artist making a comeback that year (the others on the list have faded – Graham Parker?) and so it was me, the television and Dance Me to the End of Love. My response, I must admit, was a very adolescent one. Here was a beautiful song: musically simple, with lyrics not just crafted in their own right, but so perfectly matched to the demands of the song and the voice. And yet it was nothing like any of the music I listened to, and I was struck by that most ridiculous yet common of impulses – am I allowed to listen to this kind of music (enjoy this type of drink, wear these clothes, express this opinion, go out with this woman)? Happy to say, the pull of the pack was weak that evening, and the next week I scoured the bins of one of Masterton’s two record stores and Various Positions was mine.

I never knew the album was initially a flop, or that it would be many years before Jeff Buckley’s cover would take Hallelujah to the world. I just knew that a middle-aged Canadian poet in a suit was rocking my world in the way a hundred sneering punks in Docs had never quite managed to do. At some point I took the album to school and played it on the beaten turntable in our Seventh Form common room. A teacher, perhaps the only original mind on the staff, paused outside and poked her head in the door. ‘What are you listening to?’ ‘Leonard Cohen’. She smiled, as if there was some hope left in the world after all, and left. This was my first glimpse of an underground network of ‘those who love Cohen’, a marker of so many predictable qualities that, along with an appreciation of The West Wing or a fascination with David Hume, is a reliable indicator of the quality of the conversation that might follow. (This particular teacher, how memories come up in a tangle, left us on the last day in her class with the memorable phrase ‘all I’ve ever wanted to do was unsettle every safe thought you’ve ever had.’) Twenty-five years later, when Cohen undertook his famous tour(s), part of the magic was, I’m sure, the knowledge that we were most certainly not alone.

Seven years later, holed up in the small room we had claimed in a Tokyo gaijin house, with only a cassette player salvaged from a sidewalk throwout for entertainment, I was rescued by a chap who provided me with a copy of The Future, for me the one album that can challenge Various Positions for top spot (yes, I know, people love I’m Your Man, but I’m assuming they skip Jazz Police every time, right?). So much did I love the man’s music that I even tried to enjoy Dear Heather. That, at eighty, he would produce another of his truly fine albums in Popular Problems offers hope to all who dare believe dreams need never die.

I’m something of a philistine when it comes to poetry; too much of it comes across as a confidence trick, but that is a flaw of mine, not of the poets. For Leonard, though, I make an exception.

Viewing Trump through Loach’s Lens

Timely perhaps that on the day following the US election I should see Ken Loach’s remarkable film, “I, Daniel Blake”. I won’t pretend to understand American politics at all (although could I have done any worse than the pundits?). Nevertheless it’s very hard, after walking out of Daniel Blake, to see the world in the easy terms of smart people voting one way, and the deplorably ignorant another. A huge economic experiment has taken place over the last thirty or so years, and for all its upsides (a massive reduction in poverty levels in India and China, for example) there is now more than a generation of entrenched deprivation where, for the three decades before, sustained improvement in living standards had been the norm and far fewer had suffered the hurt and humiliation of economic exclusion. And just as Ken Loach has once again captured the humanity of that accumulating loss, surely the rise internationally of anti-establishment sentiment speaks in part to the political face of the same slow-unfolding tragedy. Yes, there is much to be dismayed by in yesterday’s result, and perhaps much to fear, but so too there is plenty of blame to share around. We’ve had an awfully long time to fix this, and yet somehow have contrived to look the other way.

Is Bono a feminist?

Aristotle suggested that virtue is located at the mid-point of two damaging extremes. This is, I think, a most profound and beautiful idea. Rather than thinking of selfishness as an unadulterated evil, with kindness the attendant good, isn’t it much more interesting to think of kindness as a virtue properly moderated by self-interest? At the most basic level, an act of kindness has more value when it is freely given, when the self-interested and confident individual chooses to offer their time, support or attention to the needs of another. Kindness without any such element of self-interested choice becomes instead an act of obligation, where the individual serves because they must, where the motivator is not an eagerness to please, but rather a great fear of not pleasing. The giver disappears, and at some point kindness morphs into something much more like exploitation.

This idea can be applied to almost any of our easy dichotomies, and it came to mind recently when reading a vigorous and witty feminist response to the rather surprising news that Bono had been named one of Glamour magazine’s Women of the Year, and then again when a good friend suggested we need new words to replace masculine and feminine. And indeed I think of it every day watching my boys growing up, and noticing the virtues I instinctively cherish in them.

I don’t know much about feminism, certainly not in terms of its current scholarship, but as a teacher and a parent it’s impossible not to form opinions about the ways boys and girls are both enabled and constrained by the gender narratives we weave about them. I’ve been teaching for twenty-six years now, and it’s clear that in this regard some things have changed for the better. In the seventies my older sister was told bluntly by her teacher that girls can’t do physics. At the time it wasn’t considered a controversial statement. Luckily she was a stubborn soul, and persevered anyway in order to fight her way into vet school. That doesn’t happen now. In fact, girls on average outperform boys academically at school level. I teach drama, and it’s okay now for boys to want to sing and dance or play real emotions on stage without any of the self-referential irony that was compulsory for male performers when I was a youngster. We’re slightly less hideous towards the non-dominant sexualities, and this gives me some hope. And yet it’s hard not to think that in some ways we’ve slipped backwards.

The internet, which was meant to liberate us by exposing us to the widest possible range of ideas and perspectives, seems to have created instead an echo chamber in which points of view have hardened. The sad and fearful hatred of the anonymous troll has created a level of anxiety that encourages the young not to explore, but rather to conform, to keep their heads down. Sexuality has been pornified, and the curated online self has made a cultural lighthouse of the bragging of the insecure. There’s an awful lot still to do, and I do wonder what it is exactly that most keeps us from doing it.

Here are two ideas, the first being a return to Aristotle. It seems to me that the zero-sum narrative of privilege, that women could only begin to catch up at the expense of men, was at the very least bad marketing. The truth, I think, had we men been able to grasp it, is that there’s an awful lot for everybody to gain if we choose not to view the world through a feminine/masculine filter. That we (by which I mean men) didn’t see that was, I imagine, due in large part to the normal resistance to change. We became fixated on what we might lose, and so missed the gains available, if not for us, then at least for our sons.

Given how huge those gains are, that’s a testimony to human stupidity, and reveals the great flaw in our tendency to think only in terms of oppositions. I speak of the lucky young men I see now who are physically comfortable in their world, who will drape themselves over the body of a male friend as they sit listening in class, who are not so terrified of their own sexuality that they will visit it upon their future partners in the form of aggression. I speak of the last seven years of my life devoted primarily to the role of fatherhood (and tears well in my eyes as I write this), of the beautiful connection to one’s children available to any parent with the time to offer. I speak of the immeasurable value of a relationship with one’s partner built first on a friendship of equals. Who wouldn’t want all that? Why exactly aren’t men demanding it?

And here, I suspect, my friend was right. We need new words. Men instinctively resist the virtues of their feminine selves because they fear this might somehow dilute their manhood. But what other words could we use? The best I can come up with, and their inadequacy is immediately apparent, is the assertive and immersive in all of us. We have both the capacity to compete, to be noticed, to demand that we be listened to, and the ability to empathise, placate, share and nurture, to lose ourselves in the world of the other. Aristotle might say the virtue is to be found in the middle ground.

The second idea stems from another field of philosophy, that of free will. Our western Christian tradition likes to think very much in terms of individual responsibility and choice, indeed our entire economic system relies upon the ongoing mythologising of this value system. There is much to be admired in the notion of personal responsibility, of course, but notice how quickly it transforms into blame. If people are free to do as they please, and they do things that are stupid and harmful, then that is their fault. And if it’s their fault, then the way to fix it is to tell them to stop being so stupid and harmful – loudly and repetitively. Until they get it. And as any school teacher will tell you, blaming and ranting doesn’t work (although it makes us feel better sometimes). If you want to change behaviours, you also have to alter the environment that causes those behaviours. Too much of the rhetoric surrounding gender is the rhetoric of blame, of identifying perpetrator and victim, and I’m not sure that leads easily to solutions.

If we don’t like the way men behave, and there are a bunch of good reasons not to much of the time, then the most productive thing to do is to think about how we might change this. If our society remains committed to an almost Victorian level of embarrassed secrecy when it comes to our sexuality, then the pornography industry will rush in to fill the void. And that’s our own stupid fault. All of ours. If we dress our boys in military camouflage and our girls as fairies then yes, they will come to think of themselves as profoundly different from one another. If the world our children grow up in is one where men mow lawns and women cook dinner, then the assertive and immersive will remain forever detached. Perhaps feminism, like charity, starts at home.

And now I hear my one year old son stirring in his slumber, and I must get ready to take him to school with me, for his mother is in Australia, and I rather hope she’s having a party.

For the Greater Good

The novel I’m currently working on (along with a related screenplay) is based on a stage play of mine, Singer. The title is a reference to Peter Singer, the well-known philosopher and ethicist, who is a proponent of Utilitarianism. This is the view, roughly speaking, that the correct aim of an ethical system is to maximise the enjoyment/fulfilment (however you might measure that) of people in general. It is a provocative stance in that it sounds like a reasonable goal when you first consider it, but if you dig deeper it suggests moral conclusions that work against common moral instincts: consider, for example, how you might feel about a society that chose people at random and forced them to donate a kidney to save a stranger. It’s also an interesting approach because it presents us with moral challenges that we might normally manage to ignore.

The counter-intuitive cases aside, there’s also a pragmatic objection to the utilitarian ideal, and one I’ve been thinking about lately in relation to the ways modern schools operate. Institutions, particularly corporate and government entities, are quick to exploit the language of utilitarianism to suit their own agendas, and what makes this process insidious is that, I suspect, very often they don’t even realise they’re doing it.

The corporate world, and increasingly education too, is a very goal-focussed beast. Be it increasing shareholder wealth or market share, raising pass rates or lowering a budget deficit, corporate entities are at their very best when they have an overarching goal. The behaviours of the various cogs within the machine can then be structured so as to best serve this goal, and decisions regarding resource allocations and policy settings can be made relative to this objective. It makes things, on one level, very simple and correspondingly efficient. The difficult question (what are we really trying to achieve in this particular situation?) is replaced by a much simpler one (how will this help us achieve our goal?).

The goal, in Utilitarian parlance, becomes the greater good, in the name of which small sacrifices and discomforts can be justified. And, in my experience, the result is very often disastrous.

The biggest problem, as I see it, is the way the goal, once named, is somehow ushered into the rarefied realm of the unscrutinised. In New Zealand in the 1980s the Labour Government (and the National one that followed) oversaw a remarkably brutal transformation of the economy, in the name of reducing headline inflation. It was clear at the time, and has subsequently become even clearer, that a massive cost was being incurred in terms of poverty, social dislocation, addiction, mental illness, intergenerational disadvantage, crime and the like, yet the line that was consistently pushed was that all of this was a necessary evil, in the pursuit of a greater good: a price-stable economy. These days we have prices that are stubbornly stable, and it is tremendously difficult to argue (with a straight face) that the pay-off was worth the agony. Yet somehow the reason for the transformation escaped proper scrutiny, and part of this, I think, is the beguiling nature of the greater-good argument. Suffering in the name of a higher purpose has a sort of nobility associated with it, and makes dullards of us if we are not vigilant.

Modern schools, in the essentially competitive model brought about through the Tomorrow’s Schools reforms, are very good at talking in terms of their greater goals. While these are expressed in many forms, they tend to boil down to two things: higher pass rates, and higher regard within their community. And, as with economic fetishes, both these aims escape serious analysis. Looking good in the community’s eyes is a hugely distorting goal, privileging as it does style over substance. The monolithic IT monster has been quick to exploit the opportunity presented. Convince people that their children will suffer if they are not constantly interacting with the latest shiny digital toys, and schools will automatically come running, devoting more and more of their budgets to serving an industry that hardly needs their support, and then wondering why it’s getting harder and harder to balance their books. Ditto the shiny prospectus, or the eye-catching road-front building project. Marketing is expensive and, in a situation where the value of the product is not in question and the spending therefore not discretionary, something of a pointless waste of everybody’s time and resources.

Raising grades seems more obviously beneficial. We want kids to learn as much as they can, right? Nevertheless, a number of problems are often missed here. The first is the tyranny of the measurable. Very little that is valuable in education is easily measured, and so the more focus that goes on measurable outputs, the more resources will be shifted away from useful education in favour of education that gets ‘results’. A classroom where the overriding concern is drilling students in preparation for examinations is a dreary and uninspiring place to be. It’s also clear that students lose a great deal of autonomy in a grade-motivated world. The school’s focus moves from ‘how can I help this student get what they need?’ to ‘how can this student help the school get what it needs?’ (namely, a better grade profile). In a world where teen anxiety disorders are rising inexorably, this doesn’t feel like a smart move.

Finally, given that grades mostly serve as a rationing device, schools endlessly competing with one another to have their students outperform others may produce remarkably little, if any, net gain. Schools, rather than preparing students for the life ahead as well as they can, can get caught up in a zero-sum game of trying to help their students gain opportunities at the expense of the students in a neighbouring school. Hardly the sort of mission statement that will attract the next generation of idealists into the profession.

The more schools commit unthinkingly to these types of greater-good goals, the more brutal they become in the treatment of their staff (and weirdly, the more pious they become if challenged – ‘don’t you know, we’re doing it for the children?’). In New Zealand, the proportion of new staff brought into non-permanent positions is rising, and sadly some schools now treat these appointments as probationary. Budget constraints, very often brought about by a school’s slavish devotion to unexamined goals, have seen support staff in particular, the most vulnerable and underpaid members of the workforce, treated without respect or compassion. The old virtues of loyalty and kindness have been replaced by a utilitarian call to arms: ‘Yes, we know this seems hard, but we are doing it for the students, for the school.’

It is almost enough to make one nostalgic for the virtue ethics of the past, where acting with honour, kindness and honesty was valued above all other goals. Who said history is progressive?

For Goodness Sake

This coming week I’m giving a talk on moral knowledge, and whether our current understanding of science points to a world where moral knowledge is impossible. That’s clearly a wander into the dark world of philosophy, and as with most such discussions, it will all come down to definitions.

The discussion will run much longer than this post, but to summarise the main idea, it goes something like this:

Traditionally people have held the idea that some things are just plain right and plain wrong. It’s good to help out one’s friends and family, not so good to cause them needless pain, and so on. Societies have, across time and place, developed moral codes which have served the purpose of reinforcing social norms. And, for most of history, people have linked these moral oughts to the existence of some higher force or being. Right and wrong weren’t so much a social construction as a matter of truth. When an action was deemed morally proper it wasn’t because that’s ‘just the way our people have traditionally thought’ but because it really was the right thing to do. Social obligations and Gods were inextricably linked.

So inextricably linked, indeed, that as some traditions came to question the existence of such Gods, they also questioned the existence of moral truths. This is the existential abyss to which adolescents are so attracted. I conclude there is no God, therefore there is no meaning, no true moral compass, just some dark, threatening void.

From this conclusion we get a modern fashion for moral relativism: looks bad to us, but hey, that’s just because we have a different moral perspective. Who’s to judge, really?

I have two lines of argument on this. The first is that the God argument is something of a red herring. Even if God does exist, we still have good reason to question whether that being’s moral opinions are available to us. Given the observed correlation between physical brain states and thinking states, which is the standard conclusion in current brain science, and the evolved nature of that brain, the current conclusion in biology, and the probabilistically predictable relationship between past and present physical states, the standard conclusion in physics, moral knowledge would require some sort of physical miracle. In other words, to know what God thinks, we would need our brains to configure in a way not consistent with the known scientific description of matter. This might happen on a moment by moment basis within the unknown recesses of our brain, or it might have happened in a single event in the evolutionary past, or as a single historical event, perhaps a God walking amongst us with a prescriptive moral code in hand, but in any case it requires a miracle. One is free to believe any of these things, of course, but for those of us who consider science gives us our best description of the physical world, that solution, and hence moral knowledge, is off the table, irrespective of whether God exists.

My second line of argument will be that it is a mistake to conclude that this lack of moral knowledge implies relativism. In other words, we can still believe in objective moral standards, even if we’re not moral realists. How might we make this case? Briefly, we can observe other areas of human knowledge where we have clear objective facts, despite the area itself not being grounded in some higher reality. Take prime numbers, for example. That there is an infinite number of them is an objective fact, but this doesn’t mean prime numbers must really exist, independent of our human construction of them. Construct a number set with its basic rules and certain objective facts fall out. Or think of chess. We set up the game with its rules, and from there a number of objective facts emerge (a particular configuration of the board constitutes checkmate, for instance).
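(For the curious, here is a quick sketch of Euclid’s classic argument, set down in LaTeX for want of a tidier notation; it shows how an objective fact about primes falls out of nothing more than the rules we construct for numbers.)

% Suppose, for contradiction, that the primes formed a complete finite list p_1, p_2, ..., p_n.
\[
N = p_1 p_2 \cdots p_n + 1 \quad\Rightarrow\quad N \equiv 1 \pmod{p_i}\ \text{for every } i.
\]
% N is greater than 1, so it has at least one prime factor q, and q cannot be any p_i,
% since each p_i leaves remainder 1 when dividing N. So q is a prime missing from the list:
% a contradiction, and hence no finite list can contain every prime.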

So, we don’t have a necessary logical link between realism and objective facts. The question becomes, could morality be a bit like chess or prime numbers? The obvious objection is that there is still something arbitrary, and potentially relativistic, about chess. Couldn’t some other group of people play chess with different rules, and so a different set of facts? Well, no, because then they wouldn’t be playing chess. Chess is defined by its rules or constraints. Could morality be?

Here we need to define morality, a step that is too rarely taken seriously (free will is another area where taking a bit of care with definitions would save an awful lot of confusion). Morality, to be meaningful, must refer to ‘oughts’ of behaviour. A moral act is one where there is some form of requirement to act. And so we need to ask, where does this requirement come from? In other words, why ought we behave morally? If we can’t answer this question, we don’t have a moral system, and we are in fact speaking of something else. The traditional take is that the ought comes from God. We ought to behave morally because God wants us to (and the ancient Greeks had an objection to this – is the thing good because God wills it, or does God will the thing because it is good?).

But, if we rule out this form of compulsion, on either scientific or logical grounds, where else could the ought come from? Why ought I behave any particular way? There seem to be three inter-related answers: self-interest, social interest, and moral intuition. If I have a clear instinct about certain moral situations (and in ethics, intuition is treated as the equivalent of empirical data: if a proposed moral system does not accord with our deepest moral instincts, we reject it) then that creates the sort of ought a moral system requires. Self-interest, the desire to stay alive or to live well, clearly creates a compulsion to act, and because we are irreducibly social in our nature, so too does the need to follow the rules our social system requires in order to remain stable.

Hence, it seems to me to be clearly feasible to ground our moral systems in our evolved instincts, along with the requirements of a stable social system. And as soon as you do that, you no longer live in a world of relativistic, anything goes, morality. Some things are objectively wrong, in that they clash directly with evolved human moral instincts, or they threaten the stability of the social systems humans need in order to flourish.

This is not to say these facts will reduce to the same simple set of dictums we might find in number systems or chess. Human social systems are vastly more complex, and in many cases there will be parallel solutions to the same moral problem, and there will be cases where the moral imperative is not at all clear. Of course, this is true even if we take the realist line. Believing in moral reality does not relieve us of the burden of endless tortuous discussions, dissections and digressions in the moral field.

So, in essence, if we take a moment to see what moral systems are (a set of plausibly grounded oughts) relativism loses its grip on us: the need to find reason to act severely constrains, and hence objectifies to that degree, the sort of actions we can properly call moral. Or so I will argue.

 

Celebrating Hope

Sometimes understanding builds slowly: disparate elements slotting into place until the last piece falls and other seemingly unrelated pieces cohere into a whole. In a way, that’s all understanding is, a feeling of coherence. This week a puzzle came together in such a way for me, with the final part being a couple of graduation pieces my Drama colleague and I put together for our Year 13 students: two plays that both looked at the danger of the isolated teenage boy, desperately wanting to connect, yet lacking the skills or confidence to make it happen. Another colleague, on seeing what was a compelling yet bleak performance, asked ‘where were the answers?’

For a great deal of my writing and teaching career, I would have turned to my locked and loaded answer without thought or hesitation – ‘It is not necessarily the place of art to provide answers. It is enough to provide insight and perspective, to provoke thought.’ I still think that’s true, but there is a danger that comes from embracing this perspective too easily. It is the risk of falling into a habit of relentless cynicism, of endlessly rehashing and representing the dark and the dangerous, as if this in itself constitutes a radical and edgy act. From there, it is only a small step to exploiting shock value for the attention it can bring in a crowded market. And yet, cynicism in art is easy. Hope is a much more difficult act to pull off, because when it misses the results are cloying and twee.

Offering answers, in other words, exposes us. To say ‘here’s a problem’ is no stretch. Problems are easily identified and expanded upon, and their very nature is dramatic. Story, in the end, is the playing out of the tension between desire and impediment: the exploration of problems. But, at the point we say not just ‘here’s the problem’ but also ‘here’s how we ought to solve it’ we offer up our inevitable inadequacy: the depth of our ignorance, and also somehow the universality of our frightened desires. We are much more likely to be ridiculed for our solutions than our stating of the problem. And so fear of appearing less substantial than we would like to seem drives us towards the safer expressions of darkness.

Critical response reinforces the effect. Everybody loves the clown, but no one gives them the prize. Scan through the lists of literary titles offered for study in any high school or university, and see how much of it dwells on the darker side of our nature. As if there is something intellectually unworthy about wishing to live well, in love and joy. Which speaks both to the base unhappiness of those doing the choosing (perhaps it is this misery that sends the avid reader in search of an escape from the world that so offends them) and their desperate desire to appear clever in an intellectual, rather than emotional, sense.

In all likelihood, a teenager will make it through their secondary years never studying a love story with a happy ending. And yet, it is love, and nothing else, that can lift our lives beyond the drudgery. Finding love and sustaining it is surely the very thing we most wish for our children, and yet God forbid we should let them read about it. How trivial that would be.

The other parts of the puzzle have been accumulating for a while now. Watching my own children grow, and this year writing them their own novel, and feeling how strongly my instincts were pulled towards letting them believe in a world where their avatars could find companionship, security and hope. Studying Brecht with my Year 13s, and facing the inevitable question of what most moves us to the betterment of others, radicalised intellectual response or personal identification and empathy. Thinking from there about history, and wondering which impulses led to our greatest triumphs. Revolution makes for better epics, but empathy, I suspect, has been the more powerful force in the long run.

Starting a new novel was in there too: having the plan laid out, the story solid, the characters alive and ready to lead me through the pages, and noticing how reluctant I was to return to the story, how little heart I had in it.

Going back through some of Hilary Putnam’s offerings in the year of his death, and being struck by his clear assertion that the rejection of an objective morality was the great intellectual fraud of the 20th century, and connecting this to our reluctance in story to embrace the hopeful and the good. But don’t you know, says the mistaken intellectual, goodness is relative. There’s no such thing.

Thinking about my ongoing love affair with The West Wing, the series I always go back to, despite it not having quite the same gravitas or production values as the great series of our time, The Wire for example, or Deadwood. But knowing nevertheless that for me it will always be the better show. Yes, the dialogue is smarter, the storylines faster and more gleeful, but more than that, I’m sure, Aaron Sorkin’s true love for the universe he wishes to inhabit offers the viewer the chance to be uplifted. Sure, sometimes it misses and we feel preached to and patronised, but there’s a rare beauty in the moments when he succeeds. And at least he was brave enough to try.

Thinking too of a talk I’ll give next week (September 27th, 12.15, St Andrews on The Terrace) on the nature of morality, and the difference between moral realism, and moral objectivism. I’ll be claiming that the relativists, in rightly rejecting our capacity to access any kind of moral reality, rather threw the baby out with the bathwater when they erroneously concluded that a theory of moral objectivism could no longer be constructed.

There are lives worth living, and what little we can glean of these we must surely shout from the rooftops. And so a challenge lies ahead for this writer. For my next play, and my next novel too, I must celebrate all that is best in us. Anything else is just whistling in the dark.