For the Greater Good

The novel I’m currently working on (along with a related screenplay) is based on a stage play of mine, Singer. The title is a reference to Peter Singer, the well-known philosopher and ethicist, who is a proponent of utilitarianism. This is the view, roughly speaking, that the correct aim of an ethical system is to maximise the enjoyment/fulfilment (however you might measure that) of people in general. It is a provocative stance: it sounds like a reasonable goal when you first consider it, but dig deeper and it suggests moral conclusions that work against common moral instincts. Consider, for example, how you might feel about a society that chose people at random and forced them to donate a kidney to save a stranger. It’s also an interesting approach because it presents us with moral challenges that we might normally manage to ignore.

The counter-intuitive cases aside, there’s also a pragmatic objection to the utilitarian ideal, one I’ve been thinking about lately in relation to the ways modern schools operate. Institutions, particularly corporate and government entities, are quick to exploit the language of utilitarianism to suit their own agendas, and what makes this process insidious is that, I suspect, very often they don’t even realise they’re doing it.

The corporate world, and increasingly education too, is a very goal-focussed beast. Be it increasing shareholder wealth or market share, raising pass rates or lowering a budget deficit, corporate entities are at their very best when they have an overarching goal. The behaviours of the various cogs within the machine can then be structured so as to best serve this goal, and decisions regarding resource allocation and policy settings can be made relative to this objective. It makes things, on one level, very simple and correspondingly efficient. The difficult question (what are we really trying to achieve in this particular situation?) is replaced by a much simpler one (how will this help us achieve our goal?).

The goal, in utilitarian parlance, becomes the greater good, in the name of which small sacrifices and discomforts can be justified. And, in my experience, the result is very often disastrous.

The biggest problem, as I see it, is the way the goal, once named, is somehow ushered into the rarefied realm of the unscrutinised. In NZ in the 1980s the Labour Government (and the National one that followed) oversaw a remarkably brutal transformation of the economy, in the name of reducing headline inflation. It was clear at the time, and has subsequently become even clearer, that a massive cost was being incurred in terms of poverty, social dislocation, addiction, mental illness, intergenerational disadvantage, crime and the like, yet the line that was consistently pushed was that all of this was a necessary evil in the pursuit of a greater good: a price-stable economy. These days we have prices that are stubbornly stable, and it is tremendously difficult to argue (with a straight face) that the pay-off was worth the agony. Yet somehow the reason for the transformation escaped proper scrutiny, and part of this, I think, is the beguiling nature of the greater-good argument. Suffering in the name of a higher purpose has a sort of nobility associated with it, and makes dullards of us if we are not vigilant.

Modern schools, in the essentially competitive model brought about through the Tomorrow’s Schools reforms, are very good at talking in terms of their greater goals. While these are expressed in many forms, they tend to boil down to two things: higher pass rates, and higher regard within their community. And, as with economic fetishes, both these aims escape serious analysis. Looking good in the community’s eyes is a hugely distorting goal, privileging as it does style over substance. The monolithic IT monster has been quick to exploit the opportunity presented. Convince people that their children will suffer if they are not constantly interacting with the latest shiny digital toys, and schools will automatically come running, devoting more and more of their budgets to serving an industry that hardly needs their support, and then wondering why it’s getting harder and harder to balance their books. Ditto the shiny prospectus, or the eye-catching road-front building project. Marketing is expensive and, in a situation where the value of the product is not in question and the spending therefore not discretionary, something of a pointless waste of everybody’s time and resources.

Raising grades seems more obviously beneficial. We want kids to learn as much as they can, right? Nevertheless, a number of problems are often missed here. The first is the tyranny of the measurable. Very little that is valuable in education is easily measured, and so the more focus that goes on measurable outputs, the more resources will be shifted away from useful education in favour of education that gets ‘results’. A classroom where the overriding concern is drilling students in preparation for examinations is a dreary and uninspiring place to be. It’s also clear that students lose a great deal of autonomy in a grade-motivated world. The school’s focus moves from ‘how can I help this student get what they need?’ to ‘how can this student help the school get what it needs?’ (namely, a better grade profile). In a world where teen anxiety disorders are rising inexorably, this doesn’t feel like a smart move.

Finally, given that grades mostly serve as a rationing device, schools endlessly competing with one another to have their students outperform others may produce remarkably little, if any, net gain. Schools, rather than preparing students for the life ahead as well as they can, can get caught up in a zero-sum game of trying to help their students gain opportunities at the expense of the students in a neighbouring school. Hardly the sort of mission statement that will attract the next generation of idealists into the profession.

The more schools commit unthinkingly to these types of greater-good goals, the more brutal they become in the treatment of their staff (and weirdly, the more pious they become if challenged – ‘don’t you know, we’re doing it for the children?’). In New Zealand, the proportion of new staff brought into non-permanent positions is rising, and sadly some schools now treat these in a probationary manner. Budget constraints, very often brought about by a school’s slavish devotion to unexamined goals, have seen support staff in particular, the most vulnerable and underpaid members of the workforce, treated without respect or compassion. The old virtues of loyalty and kindness have been replaced by a utilitarian call to arms: ‘Yes, we know this seems hard, but we are doing it for the students, for the school.’

It is almost enough to make one nostalgic for the virtue ethics of the past, where acting with honour, kindness and honesty were to be valued above all other goals. Who said history is progressive?

For Goodness Sake

This coming week I’m giving a talk on moral knowledge, and whether our current understanding of science points to a world where moral knowledge is impossible. That’s clearly a wander into the dark world of philosophy, and as with most such discussions, it will all come down to definitions.

The discussion will run much longer than this post, but to summarise the main idea, it goes something like this:

Traditionally people have held the idea that some things are just plain right and plain wrong. It’s good to help out one’s friends and family, not so good to cause them needless pain etc. Societies have, across time and place, developed moral codes which have served the purpose of reinforcing social norms. And, for most of history, people have linked these moral oughts to the existence of some higher force or being. Right and wrong wasn’t so much a social construction as a matter of truth. When an action was deemed morally proper it wasn’t because that’s ‘just the way our people have traditionally thought’ but rather, because it really was the right thing to do. Social obligations and Gods were inextricably linked.

So inextricably linked, indeed, that as some traditions came to question the existence of such Gods, they also questioned the existence of moral truths. This is the existential abyss to which adolescents are so attracted. I conclude there is no God, therefore there is no meaning, no true moral compass, just some dark, threatening void.

From this conclusion we get a modern fashion for moral relativism: looks bad to us, but hey, that’s just because we have a different moral perspective. Who’s to judge, really?

I have two lines of argument on this. The first is that the God argument is something of a red herring. Even if God does exist, we still have good reason to question whether that being’s moral opinions are available to us. Given the observed correlation between physical brain states and thinking states (the standard conclusion in current brain science), the evolved nature of that brain (the current conclusion in biology), and the probabilistically predictable relationship between past and present physical states (the standard conclusion in physics), moral knowledge would require some sort of physical miracle. In other words, to know what God thinks, we would need our brains to configure in a way not consistent with the known scientific description of matter. This might happen on a moment-by-moment basis within the unknown recesses of our brain, or it might have happened in a single event in the evolutionary past, or as a single historical event, perhaps a God walking amongst us with a prescriptive moral code in hand, but in any case it requires a miracle. One is free to believe any of these things, of course, but for those of us who consider science gives us our best description of the physical world, that solution, and hence moral knowledge, is off the table, irrespective of whether God exists.

My second line of argument will be that it is a mistake to conclude that this lack of moral knowledge implies relativism. In other words, we can still believe in objective moral standards, even if we’re not moral realists. How might we make this case? Briefly, we can observe other areas of human knowledge where we have clear objective facts, despite the area itself not being grounded in some higher reality. Take prime numbers, for example. It is true that there is an infinite number of them; this is an objective fact, but it doesn’t mean prime numbers must really exist, independent of our human construction of them. Construct a number system with its basic rules and certain objective facts fall out. Or think of chess. We set up the game with its rules, and from there a number of objective facts emerge (a particular configuration of the board constitutes checkmate, for instance).
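For the reader who wants to see the mechanics, Euclid’s old argument shows how that fact falls out of nothing more than the rules of arithmetic. Suppose the primes could be listed in full as $p_1, p_2, \ldots, p_n$, and form the number

$$N = p_1 p_2 \cdots p_n + 1.$$

Every whole number greater than 1 has a prime divisor, yet none of the listed primes divides $N$ (each leaves a remainder of 1), so some prime is missing from the list, and no finite list can ever be complete. At no point do we consult a realm where the primes ‘really’ live; the construction itself forces the conclusion.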

So, we don’t have a necessary logical link between realism and objective facts. The question becomes, could morality be a bit like chess or prime numbers? The obvious objection is that there is still something arbitrary, and potentially relativistic, about chess. Couldn’t some other group of people play chess with different rules, and so a different set of facts? Well, no, because then they wouldn’t be playing chess. Chess is defined by its rules or constraints. Could morality be?

Here we need to define morality, a step that is too rarely taken seriously (free will is another area where taking a bit of care with definitions would save an awful lot of confusion). Morality, to be meaningful, must refer to ‘oughts’ of behaviour. A moral act is one where there is some form of requirement to act. And so we need to ask: where does this requirement come from? In other words, why ought we behave morally? If we can’t answer this question, we don’t have a moral system, and we are in fact speaking of something else. The traditional take is that the ought comes from God. We ought to behave morally because God wants us to (and the ancient Greeks had an objection to this – is the thing good because God wills it, or does God will the thing because it is good?).

But, if we rule out this form of compulsion, on either scientific or logical grounds, where else could the ought come from? Why ought I behave any particular way? There seem to be three inter-related answers: self-interest, social interest, and moral intuition. If I have a clear instinct about certain moral situations (and in ethics, intuition is treated as the equivalent of empirical data: if a proposed moral system does not accord with our deepest moral instincts we reject it) then that creates the sort of ought a moral system requires. Self-interest, the desire to stay alive, or live well, clearly creates a compulsion to act, and because we are irreducibly social in our nature, so too does the requirement to follow the rules that our social system requires in order to remain stable.

Hence, it seems to me to be clearly feasible to ground our moral systems in our evolved instincts, along with the requirements of a stable social system. And as soon as you do that, you no longer live in a world of relativistic, anything-goes morality. Some things are objectively wrong, in that they clash directly with evolved human moral instincts, or they threaten the stability of the social systems humans need in order to flourish.

This is not to say moral truths will reduce to the same simple set of dictums we might find in number systems or chess. Human social systems are vastly more complex, and in many cases there will be parallel solutions to the same moral problem, and there will be cases where the moral imperative is not at all clear. Of course, this is true even if we take the realist line. Believing in moral reality does not relieve us of the burden of endless tortuous discussions, dissections and digressions in the moral field.

So, in essence, if we take a moment to see what moral systems are (a set of plausibly grounded oughts) relativism loses its grip on us: the need to find reason to act severely constrains, and hence objectifies to that degree, the sort of actions we can properly call moral. Or so I will argue.


Celebrating Hope

Sometimes understanding builds slowly: disparate elements slotting gradually into place until the last piece falls and other seemingly unrelated pieces cohere into a whole. In a way, that’s all understanding is: a feeling of coherence. This week a puzzle came together in such a way for me, with the final part being a couple of graduation pieces my Drama colleague and I put together for our Year 13 students: two plays that both looked at the danger of the isolated teenage boy, desperately wanting to connect, yet lacking the skills or confidence to make it happen. Another colleague, on seeing what was a compelling yet bleak performance, asked ‘where were the answers?’

For a great deal of my writing and teaching career, I would have turned to my locked and loaded answer without thought or hesitation – ‘It is not necessarily the place of art to provide answers. It is enough to provide insight and perspective, to provoke thought.’ I still think that’s true, but there is a danger that comes from embracing this perspective too easily. It is the risk of falling into a habit of relentless cynicism, of endlessly rehashing and representing the dark and the dangerous, as if this in itself constitutes a radical and edgy act. From there, it is only a small step to exploiting shock value for the attention it can bring in a crowded market. And yet, cynicism in art is easy. Hope is a much more difficult act to pull off, because when it misses the results are cloying and twee.

Offering answers, in other words, exposes us. To say ‘here’s a problem’ is no stretch. Problems are easily identified and expanded upon, and their very nature is dramatic. Story, in the end, is the playing out of the tension between desire and impediment: the exploration of problems. But, at the point we say not just ‘here’s the problem’ but also ‘here’s how we ought to solve it’ we offer up our inevitable inadequacy: the depth of our ignorance, and also somehow the universality of our frightened desires. We are much more likely to be ridiculed for our solutions than our stating of the problem. And so fear of appearing less substantial than we would like to seem drives us towards the safer expressions of darkness.

Critical response reinforces the effect. Everybody loves the clown, but no one gives them the prize. Scan through the lists of literary titles offered for study in any high school or university, and see how much of it dwells on the darker side of our nature. As if there is something intellectually unworthy about wishing to live well, in love and joy. Which speaks both to the base unhappiness of those doing the choosing (perhaps it is this misery that sends the avid reader in search of an escape from the world that so offends them) and their desperate desire to appear clever in an intellectual, rather than emotional, sense.

In all likelihood, a teenager will make it through their secondary years never studying a love story with a happy ending. And yet, it is love, and nothing else, that can lift our lives beyond the drudgery. Finding love and sustaining it is surely the very thing we most wish for our children, and yet God forbid we should let them read about it. How trivial that would be.

The other parts of the puzzle have been accumulating for a while now. Watching my own children grow, and this year writing them their own novel, and feeling how strongly my instincts were pulled towards letting them believe in a world where their avatars could find companionship, security and hope. Studying Brecht with my Year 13s, and facing the inevitable question of what most moves us to the betterment of others: radicalised intellectual response, or personal identification and empathy. Thinking from there about history, and wondering what were the impulses that led to our greatest triumphs. Revolution makes for better epics, but empathy, I suspect, has been the more powerful force in the long run.

Starting a new novel was in there too, having the plan laid out, the story solid, the characters alive ready to lead me through the pages, and noticing how reluctant I was to return to the story, how little heart I had in it.

Going back through some of Hilary Putnam’s offerings in the year of his death, and being struck by his clear assertion that the rejection of an objective morality was the great intellectual fraud of the 20th century, and connecting this to our reluctance in story to embrace the hopeful and the good. But don’t you know, says the mistaken intellectual, goodness is relative. There’s no such thing.

Thinking about my ongoing love affair with The West Wing, the series I always go back to, despite it not having quite the same gravitas or production values as the great series of our time, The Wire for example, or Deadwood. But knowing nevertheless that for me it will always be the better show. Yes, the dialogue is smarter, the storylines faster and more gleeful, but more than that, I’m sure, Aaron Sorkin’s true love for the universe he wishes to inhabit offers the viewer the chance to be uplifted. Sure, sometimes it misses and we feel preached to and patronised, but there’s a rare beauty in the moments when he succeeds. And at least he was brave enough to try.

Thinking too of a talk I’ll give next week (September 27th, 12.15, St Andrews on The Terrace) on the nature of morality, and the difference between moral realism, and moral objectivism. I’ll be claiming that the relativists, in rightly rejecting our capacity to access any kind of moral reality, rather threw the baby out with the bathwater when they erroneously concluded that a theory of moral objectivism could no longer be constructed.

There are lives worth living, and what little we can glean of these we must surely shout from the rooftops. And so a challenge lies ahead for this writer. For my next play, and my next novel too, I must celebrate all that is best in us. Anything else is just whistling in the dark.

Happy 21st Terence

The first play I ever wrote and directed was a piece called Terence, the story of a quick-witted teen who struggles to play by the social rules and so is treated with suspicion by his peers. He falls in love with a girl called Toni and must muster all his charm and cunning to find his way into her social world. Ultimately, though, he decides the price he is asked to pay to conform is too steep, and the play ends with him standing up the girl whose heart he has finally won.

Terence represents as good a point as any when I’m looking for the moment that marks the start of my writing career. Certainly it represents the first time my writing was presented in public. This year Terence turned 21. It was first performed on stage in 1995 by a wonderful cast at Onslow College, two of whom I’ve stayed in touch with, having had the great pleasure of later attending their weddings and meeting their lovely children. Teaching is a rare privilege, much of the time.

Last week I directed Terence again, with another bunch of 15 year olds at Hutt Valley High School. This time we did it as a drama class production, which meant we performed it over three nights with three different casts, and again they did a fabulous job. The lesson I never tire of learning in these situations is the way the particular chemistry of cast and audience, crammed together in a small theatre, produces a show that cannot be reproduced. The way an audience member laughs in the first minute, the way a particular actor holds a pause, the way a touch lingers, or a movement across the stage is punctuated by a sigh: all of these things somehow set in train a chain reaction that the actor can feel and feed into, but not control. Over three nights, I saw three entirely different, and each quite splendid, shows. And despite having been the director, working with those actors over the two preparatory months, the particular nature of each performance was ultimately a surprise. And that’s why I love live theatre.

The last of the three performances re-taught me something else: the subtle interplay between comedy and narrative. Terence was always intended as a comedy, and was loaded with as many gags as I thought it could bear, and part of the actor’s responsibility is therefore to manage the comedy: building to punchlines, trusting them enough to wait for the laugh, and then trusting themselves enough to let that laughter breathe, and so on. But Terence is also a love story, and a kind of social commentary. In teen theatre, comedy is often the mechanism by which we bring the audience to the story, making it accessible and indeed palatable. Although I want the kids watching Terence to laugh, the ending is ultimately a contemplative one, and I want them to reflect upon all that is contained in Terence’s ultimate decision.

On the last night, an interesting thing happened. The audience didn’t laugh early on, in the way they had on other nights. The actors were undoubtedly just as funny, and looking down from my position in the lighting box, I could see the audience were attentive and engaged, but somehow the lead actor was veering more towards a naturalistic, rather than comic, performance. I doubt he knew this is what he was doing, but the emerging tone was one of quiet focus (weirdly, although I can’t identify the physical markers, the difference between bored and absorbed silence is obvious even to the stage-light-blinded actor). The cast instinctively went with the mood. The performances, to a person, were more authentic, less adapted to the laugh track. They weren’t better performances, necessarily – the work the comedians did on the opening nights was often outstanding – just different. And the experience of the audience, as a result, was profoundly different too.

On the previous nights, the audience were entertained, whereas on the closing night, they were absorbed. The silence during the final revelation stretched longer, and after the show, they lingered longer too, wanting to talk to the actors and discuss what they had seen. It was a lovely reminder of the compromises involved in genre work. Sometimes genre writers (and Terence is definitely a genre piece) are frustrated that their work isn’t taken seriously, and on one level that’s absolutely fair. To write successful comedy, or crime, or thrillers or musicals for that matter, is every bit as demanding of one’s craft as pure drama. But on the other hand, there’s a sense in which pure drama just is more serious. The more readily we can believe in the people before us on the stage (or screen, or page of a book) the more deeply we will connect with them.

While all theatre demands a level of artifice, genre work explicitly requires the actor to sacrifice the depth of this connection. Every time the actor is asked to burst spontaneously into song, pause to let a laugh run its course, or recap for the audience’s benefit the list of suspects, we are reminded that these people before us are not us, and that what we are watching is a contrivance. This hardly relegates the non-naturalistic to a lower theatrical rung. After all, the poetry of Shakespeare is as clear an example of theatrical contrivance as you might hope to find, and it hasn’t hurt his reputation one bit.

All of this makes such an obvious point that it’s slightly embarrassing that I would need to be reminded of it at all. Luckily I work with a tremendous bunch of talented young people, who are more than happy to reteach me these things from time to time.


In Memory

Having recently commented here on the deaths of pop icons Bowie and Prince, or rather the public reaction to them, it would be remiss of me not to note the passing of another great, in March of this year. I remember visiting a student flat in 1997, shortly after the death of Princess Diana, where the residents had cut out the full-page newspaper headline ‘the world mourns’ and beneath it pasted the small, half-column obituary of a relatively obscure academic who had died at the same time (an academic whose name I have since long forgotten, so rather proving their ironic point).

And so, in the spirit of student idealism, let me pay small tribute to the intellectual giant Hilary Putnam, to my mind one of the most important philosophers of the last century: a thinker whose expertise in mathematics and deep knowledge of science were matched by the sort of restless curiosity and genuine humility that is often referenced but rarely seen. Never afraid to change his mind in public, he always gave the impression of being motivated more by the appeal of the puzzle than the trajectory of his career. Putnam played a crucial role in the revival of the American pragmatist tradition, an appealing middle ground between the hopeless extremes of scepticism and foundationalism.

While Putnam’s contribution to modern thought is too broad to encapsulate in a brief remark, I’ll highlight just one idea that seems to me tremendously important. Putnam argued, in his book The Collapse of the Fact/Value Dichotomy, that the logical positivists’ public legacy was a new sense that while science dealt in cold hard facts, areas such as aesthetics or ethics were ultimately subjective, to the point that there was no rational discussion to be had about them; rather, one should simply accept that different people have different views and that’s all there is to it. In other words, it became part of the general western view that, while one could reason carefully about elements of the physical world, the moral sphere was to be approached rather as a matter of personal intuition and taste. This rather played into the hands of the capitalist narrative, whereby collective values are subsumed by the individual’s ambitions, and any attempt by the state to interfere in matters of personal value is to be treated with suspicion.

Putnam pointed out both the dangers in this view and the shakiness of its intellectual foundations. Putnam never argued that there weren’t differences between facts and values – indeed he said there are many – but he was at pains to point out that these did not amount to the subjective/objective paradigm so often expressed. A relatively easy way to summarise this is to consider the way Putnam represented scientific endeavour. While it is quite reasonable to think of science as producing models of the physical world, he was at pains to point out both the fallible nature of these models – they represent, at any given point in time, our best guess on the matter, and are subject to future revision – and their dependence upon theoretical frameworks (and hence values) when it comes to assessing what counts as ‘best’.

So, consider our model of the solar system, which seems as close to a cold hard fact as you are likely to get. If we think about how this particular model established itself as the best way of thinking about the relationship between planets, moons and sun, we see that certain values (simplicity, consistency, coherence, elegance, predictive power and so on) all played important roles in the collective decisions that have seen the model first become accepted, and then refined. There are very good reasons for all of these values: it is difficult to see why anyone would want to embrace descriptions of the physical world that were not usefully predictive, or that allowed unnecessary levels of whimsical complexity. Accepting that values underpin scientific judgements does not relegate science to the mire of subjectivity; it simply cautions us to be careful in our definitions of scientific facts. Scientific facts are, in a sense, defined relative to the collective values embraced by the scientific community. This notion of knowledge being collectively defined is central to the pragmatist argument. It is also important to note that this does not suggest science is not describing an actual world, or is purely a social construction. Rather, the conventions of science allow it to be tested against that world, and so its theories are constantly interacting with external constraints.

This view of science allows us to be more sensitive to the way value systems are in play, and makes more comprehensible some of the more interesting disagreements in modern science. For example, many string theorists are seduced by the theory’s mathematical elegance. Others argue that, in the absence of confirming or refuting data, the elegance in itself should not impress us at all. The relative values of elegance and predictive power are hence in play. Similarly, the search for a grand theory of everything is, in part, predicated upon the prior assumption that reality will best be modelled by a maximally simple and coherent structure, whereas others dismiss this as a wild goose chase, prompted by an undue attachment to the hope that the real world will indeed conform to the values the observers bring to the table. Equally, consider the very different reactions to quantum conundrums: from the ‘shut up and calculate’ brigade, who are quite satisfied by the predictive capacity, to those endlessly exercised by our inability to produce a coherent background model.

So, we have here a model where science is objective, but where this objectivity flows from the collective values of the community of inquiry. As is clear, then, the parallel with ethical enquiry can be fruitfully explored. A community with a collective sense of goals can meaningfully, and in some sense objectively, construct ethical systems that are true relative to those values. That we cannot ground these starting values with any compelling certainty, in the case of either science or ethics, is to the pragmatist relatively unimportant. That’s just the way it is, they might shrug, and our job is to deal with these limitations through a process of collective negotiation and exploration. What’s more, we can still make meaningful progress in creating for ourselves moral and scientific frameworks that meet our social goals of collective flourishing. In this Putnam leaves us with a most optimistic legacy.

A fish and a theatre light

I was reminded of a favourite story the other day, when preparing to give a speech to the NZ Association of Scientists at their annual conference. The point I hoped to make was the power of storytelling in education, and in particular how the stories of science can be a fabulous hook when it comes to interesting young students in science. As it happened, I never got to the story during the presentation, but the urge to recount it remains:

At the beginning of the 19th century, the popular theory regarding the behaviour of light was that it behaved as a particle. This popularity was due, in no small part, to the fact that Newton had said it did, and what Newton said tended to go. Geniuses are never wrong, right? As is the way in science, there remained a number of unresolved problems with this theory of light, and one of them was how to explain diffraction, the way light appears to bend around the edges of objects. The French Academy held a competition to explain this. An entry came in from Fresnel, an engineer, who explained it in terms of light travelling not as a particle, but as a wave.

For us theatre folk, Fresnel’s name is entwined with the business of lighting the stage, as he invented a lens that is utilised in theatre lighting (and lighthouses, I believe) and still carries his name. The esteemed members of the judging panel were initially sceptical of this unorthodox, outsider’s view, and in particular the renowned mathematician Poisson (he has a probability distribution named after him) is said to have mocked Fresnel’s entry. In order to demonstrate how preposterous it was, he noted how one cannot shelter behind a rock to protect oneself from a wave in the ocean, because waves, by their nature, can wrap around the rock and collide behind it.

Going further, he used Fresnel’s own calculations to show that, if Fresnel was right, then if you shone a light source directly at a solid object, at the right scale you would observe the brightest spot of light directly behind the object, in the very area shielded from the light source. Preposterous.

Luckily for science, another member of the panel, Arago, thought it judicious to set up exactly this experiment. No surprises for guessing what happened next. The spot appeared, exactly where predicted, and the wave theory of light was resurrected. For the science geek, this story has everything: the historical authority figure, the powerful cheerleader, the preposterous prediction and yet, crucially, the experimental observation which triumphs over these most potent forces of inertia. In science, or so we hope, data trumps prejudice.

Of course, history shows progress is rarely this clean, but nevertheless messier versions of the principle are the happy norm. Meantime, this case stands for me as, well, a beacon.



A Prince and a Duke

In the way the world has of throwing up patterns, this has been the year of the death of musical icons. I’ll not be the first to draw comparisons between David Bowie and Prince, or the last.

Bowie and Prince inhabited very similar places in my musical world, the respectable faces of popular music, so vastly talented that the reach of their appeal could not diminish their cool, even to a fragile, fashion conscious adolescent. Each dominated their respective decades, Bowie ruling the seventies, from Hunky Dory through to the Eno trilogy, and Prince moving in just as Bowie’s musical star faded. 1999, Purple Rain, Around the World in a Day, Parade and Sign o’ the Times surely represents that period’s most startling run of recordings. Along with Bob Marley, Prince provided my generation with a sound track that transcended cultures, something Bowie, even in his pomp, never managed.

And yet, it is hard to escape the feeling that the media treatment of the two deaths is both different, and instructive. The death of Bowie saw not only a devoted front page in my local paper, but an endless sequence of articles over the following days, where journalists and luminaries of a certain age dwelt at length upon the contribution Bowie made to their existence. While the world section of this morning’s paper does lead with Prince’s death, neither the front-page treatment, nor the mourning chorus, can be found. In Prince’s case, the coverage is already veering towards the salacious, with more interest in the circumstances of the death than the grandeur of the life that preceded it. Along with the probing articles on drug use (Bowie’s drug use, like Keith Richards’s, was more often passed over as the predictable indulgence of fame and youth) there has also been a piece on Prince’s love life. And, as the controversies are brought to the fore, the musical genius fades offstage. There is something shabby, and sadly predictable, about this difference.

It’s almost too obvious to point out that part of the issue here is race, and the stereotypes indulged by the lazier journalists. The broader point, perhaps, is a reminder that the world portrayed by the mainstream media is the world of a few. As it is with music, which is in many ways the trivial case, so it is with the way the spotlight falls when it comes to social and political issues, to the way history is retold, the way our heroes are selected and celebrated. All my life, this has been a reasonably comfortable state of affairs for me, for I am white and male and middle class, and latterly of the age and concerns of the agenda setters. But a great number will never see their concerns presented back to them: not on the New Year’s honours lists, not in the political campaigns, not on the six o’clock news, not even in the passing of their musical heroes.

One would like to say it’s a sign o’ the times, but the problem feels much older than that.