Phife Dawg: a tribute

A version of this article was published in The Conversation on March 24th, 2016.  

Like many people around the world yesterday, I was saddened to hear the news of the death of legendary 90s hip-hop artist Phife Dawg. I was sad not because I knew Phife Dawg – I didn’t – nor because he was too young to leave us – although he was – but because Phife Dawg and A Tribe Called Quest were part of an aural landscape that formed the soundtrack to my teenage years.

But even that’s not why.

You see, hip hop has always had a bad rap. Originating in New York in the 1970s, the genre was largely associated with the influx of Jamaican migrants such as DJ Kool Herc, who brought with them the dub technique of MCing over two records played simultaneously. Hip hop evolved as a fundamentally creative response to the lived environment, which largely meant young black men – and women – working with what they had. Just as earlier generations of poor African Americans – especially those in more rural locations of the South – improvised musical instruments from found objects, so early hip hop artists utilised pre-existing vinyl records, the urban environment and their own bodies to create something that was utterly new and innovative. Hip hop was more than just rapping; it was a culture that included breakdancing, cutting two (or more) records together, and graffiti art.

If this phenomenon needs to be broken down further, we can do just that; hip hop was an unconscious but ingenious strategy of dismantling and reinventing the world as the B-Boys knew it. Early rapping tended to take the form of ‘battles’, with each MC attempting to outdo his peers in a spirit of one-upmanship, using a combination of lyrical wit and unexpected, comical rhyming patterns. As such, the early hip hop records of the late 70s and early 80s – The Sugarhill Gang’s ‘Rapper’s Delight’ is often cited as one of the first – were often upbeat, good-humoured and full of witticisms.

As tends to happen when new musical genres evolve, the early 80s witnessed a good deal of creative fluidity between styles that would soon splinter irrevocably; early house, techno and electro music bubbled out of the same pot as hip-hop, using the same techniques of cutting between records and sampling beats. As much as we take this for granted now, this was cutting edge. In terms of the lyrical content of early rap, social commentary began to creep in (Grandmaster Flash’s ‘White Lines’, about inner-city poverty and drug use, being just such an example) but lyrics were equally likely to be about outer space or the Cold War.

By the late 80s, hip hop as a discrete musical genre had established itself and was no longer associated with graffiti or breakdancing. It was at this time that hip hop began to be viewed negatively by the mainstream press, principally because as its appeal grew more widespread (meaning that commercial artists such as MC Hammer and the much-maligned Vanilla Ice also found huge success at this time) it was finding its way into the homes and aural cavities of suburban white teens and their parents. Truth be known, the obligatory Parental Advisory label probably served as a badge of recommendation to teenagers playing with subversion – as teenagers do. But that’s by the by. If there was something explicit about the lyrics it was that, increasingly, they had something to say. And they said it. The hip hop of the late 80s and early 90s was explicit in its message, in a way that recorded music simply never had been.

Artists such as Public Enemy were amongst the most outspoken political messengers, and marked the blossoming of a brief but exquisitely powerful trend in conscious lyricism and poetry that still has no parallel across any other musical genre. It really doesn’t. For all the blanket criticisms of rap as being misogynistic, homophobic and violent, its truly revolutionary aspects have been largely overlooked. Often harking back to the civil rights movements of the 60s, the new sound referenced the profound cultural loss of an authentic homeland – Africa – and spoke of dislocation and alienation. A Tribe Called Quest along with groups like Gang Starr embodied a less aggressive form of musical messaging that had more in common with earlier rap in terms of its playful lyricism and musicality than it did with the newly-emerging spectre of gangster rap. Phife and his peers used a combination of layered melodic and atonal jazz samples to create an introspective and even intellectual sound that would later be picked up by Nas in his landmark 1994 album Illmatic; Phife’s bandmate Q-Tip was part of the production team.

Hip hop’s golden age may be over, and perhaps that’s why I’m especially saddened by the death of Phife Dawg, representing as he does the passing of an age when hip hop still contained the germ of revolution. We may not be able to recreate that age; but if we can even begin to recognise and celebrate it for what it was, we just may have the beginnings of a suitable epitaph for one of Hip Hop’s finest sons.

A meeting with Lord Bird

I met with Lord Bird, the founder of The Big Issue, on the evening before his maiden speech at the House of Lords. This unpublished article was the result of our meeting. 

Thursday, February 25th: 5.45pm. I’m sitting with John Bird on the platform at Sloane Square tube station. Sloane Square might be posh above ground but the underground is like a preparatory sketch for T.S. Eliot’s ‘The Waste Land’. Wedged onto a cold metal bench, with trains rushing in and out on waves of sound and grimy air; commuters spilling off and on, glassy-eyed as pigeons; we’ve got fifteen minutes before he catches his train. Fifteen minutes during which he’s going to tell me his entire plan to change the world, or to use the plan’s working title: ‘Why I’m becoming a life peer’. Because John Bird, ex-offender, ex-streetkid, second-generation Irish immigrant and founder of the Big Issue, is now Lord Bird: Baron Bird of Notting Hill. In the morning he will deliver himself up to the House of Lords for his maiden speech. A good Friday indeed. Tonight, though, he has an important appointment; a last supper, with family and friends, presumably with both bread and wine. And is he worried, at all?

‘I don’t worry,’ says Bird, confidentially. ‘I’m great with a hangover.’

He folds his arms over his chest. ‘Go on then, ask me some questions. Get your pen out.’ But before I open my mouth Lord Bird commences to give his answers in advance.

‘People just haven’t understood what I’ve been trying to do with the Big Issue,’ he says, with an air of frustration rather than annoyance. ‘In 1991 there were 50 or 100 or 1000 charities supposedly helping the homeless by giving them tai chi and yoga and yoghurt and what have you. But people need to have a chance to make money. The whole point was to lift them out of crime and poverty. Not keep people in it.’

I remark that, in my view, underlying issues of poverty and class as markers of discrimination have been massively sidelined by comparison with, say, race and gender – despite the fact that they’re all intertwined. There was a larger point here that I didn’t quite manage to make, but Bird nods thoughtfully and says that in the UK a white, male homeless person is often concretely worse off than if he were, say, black or female – ‘or even gay, or mentally ill – because there are structures in place, to acknowledge people from certain groups. But if you’re from the “host country”, as it were, you’re cattle trucked.’ That wasn’t exactly what I’d meant, but the point was a fair one.

So does he consider himself a socialist? ‘Everyone’s a socialist these days,’ says Bird, in a tone that’s more matter-of-fact than contemptuous; ‘unconsciously at least. Everyone’s anal about equality. But the Big Issue was never intended to be socialistic. Whatever my personal politics are, I don’t bring them with me. Believe it or not, the Big Issue was originally started in opposition to white, middle class liberalism.’

I’m about to ask him what he means by this, but he’s already telling me.

‘What I mean by that is – let me explain. The twentieth century had four forms of idealism, born out of the First World War: Fascism, Marxism, Consumerism (or pop capitalism, as I call it) and Liberalism. And what’s Liberalism? You can be anything you like, as long as it’s liberal.’

A Henry Ford dystopia? A fascist-Marxist-consumerist hybrid…

‘So what you end up with is this tokenism. But Liberals are defending people’s rights to box themselves into poverty. And poor people are treated as if they’re from another species.’

Bird clears his throat and folds his arms tighter across his chest. ‘I’m getting a sore throat,’ he says, with a tinge of anxiety, ‘too much talking. I can’t stay long. My wife’s got this meal waiting. Do you know Mondrian?’ With one finger he paints straight lines and squares in the air. ‘My wife’s a Mondrian painting. I’m a Van Gogh.’

I suggest that some people might think it’s hypocritical of him to take a seat in the House of Lords, if he’s committed to helping the poor and averse to politics.

‘Listen,’ he says, ‘I am going into the House of Lords to bring the question of poverty right into the heart of government. Both the left and the right have done incredible harm. The party political line has done us no good. Look – I can sit with a David Cameron or a Jeremy Corbyn and take all their bullshit with a pinch of salt, if it means something gets done. But none of them are addressing the big issue – the really big issue. And that is, why both the left and the right are committed to keeping poor people in poverty.’

The idea for the Big Issue, he explains, had originally come from the States. While visiting the U.S., Gordon Roddick – who with wife Anita founded The Body Shop – had encountered a man (described as looking ‘like a wardrobe’) selling street papers for a dollar. The street vendor explained that he had ‘a drug habit’ – but by buying papers for 50 cents and selling them for a dollar, he could keep himself out of trouble. That’s to say, he still took drugs – he just didn’t have to rob anyone to pay for them. Or go to jail. ‘Just like those folks up in Wall Street,’ the vendor had said, ironically. ‘They go to work, earn their money and when they need drugs they just call up their dealer and get some.’ This was at the time when America’s controversial ‘war on drugs’ was effectively creating a one-way pipeline of people from America’s urban underclass straight into the penal system. No return tickets.

Roddick came back to the UK knowing that the same scheme could work here; and that his friend John Bird was the man to do it. ‘I’ve got a background in printing,’ explains Bird. ‘And Gordon knew that when it comes to poverty, I don’t have a sentimental bone in my body.

‘What I say is this: don’t allow the poor to be poor. Question the way the poor are kept poor.’

So how would he describe himself?

‘I’m a revolutionary.’

A revolutionary or a reformist?

‘I’m a revolutionary reformist.’

Bird glances up at the station clock. ‘Right – I’ve got to go. Have you got everything you need?’ I have, and now he’s hopping onto the train, barely fitting into the crowded carriage, his heels not quite over the threshold. But with a bit of jostling, he’s in. Before the doors slide shut, he turns and gives me a thumbs up; then the tunnel swallows him up.

Out on the street, a man approaches me; he’s overdrawn his Oyster card and can’t afford to get home. ‘I have to break my pride and ask,’ he begs, over and over again, ‘or I’ll have to stay here all night.’ It’s freezing cold, but I’m not all that flush myself. I give him a couple of pounds – not enough – and walk away. Minutes later I remember I’ve got a valid travelcard in my pocket that he could have used; I run back to find him, but already it’s too late. He’s gone.

Zombies, pride, prejudice – and amnesia

A slightly different version of this article was published in The Conversation on February 19th, 2016.

When George Romero created the first modern zombie flick in 1968, he hadn’t imagined his zombies as – well, as zombies.

“To me back then, zombies were those voodoo guys who were given some sort of blowfish cocktail and became slaves. And they weren’t dead so I thought I was doing a brand new thing by raising the dead.”

But since Night of the Living Dead, the spectre of the zombie as a brain-eating, revivified corpse has become a pop-culture staple. If our screens are to be believed, the zombie apocalypse is now; even Amazon’s terms of service have recently been updated to cover just such a catastrophe. And this week sees The Walking Dead’s sixth mid-season premiere, coinciding with the release of Hollywood’s latest: Pride and Prejudice and Zombies.

Putting a zombie spin on a classic novel by Jane Austen seems like it ought to be an unusual twist. But it isn’t. In 1943 film director Jacques Tourneur based the narrative structure of I Walked With a Zombie on Charlotte Brontë’s 1847 novel Jane Eyre – and well before Jean Rhys wrote her famous prequel to Brontë’s novel set in post-slavery Jamaica. But I Walked With a Zombie, for all its schlock-horror aspirations, is a genuinely haunting piece of film history. Tourneur’s zombie wasn’t the flesh-eating living dead kind – the kind that, after all, Romero never intended to be called zombies – his was an actual, Caribbean zombie; the somewhat tragic figure of a human being maintained in a catatonic state – a soulless body – and forced to labour for whoever had cast the spell over him or her. In other words, the zombie is a slave. I always find it troubling that, somewhere along the line, we either forgot or refused to acknowledge what a zombie actually is. Instead we’ve replaced him with the figure of a mindless carnivore, and one that reproduces, virus-like, with a bite.

But, wait. There is no singular zombie tradition from the Caribbean. The word ‘zombie’ itself has a number of possible origins, with similar words being found all across West Africa – which of course, is precisely where the slaves came from – and meaning anything from ‘devil’ to ‘spirit’ to ‘Creator God’ (Congolese ‘Nzambi’). There is even a word in the indigenous Caribbean Arawak language – ‘zemi’ – which refers to an ‘ancestral spirit’ and which has also been cited as the etymological source. And with the fluidity that characterises folklore and shifting local traditions, there are different kinds of zombi too, and the word can sometimes be used to suggest a spirit or ghost not unlike the locally-terrifying Jamaican ‘duppy’ (evil spirit). But the most resonant and persistent tradition of the zombie is one who has been conjured into a soulless state and forced to labour. Crucially the zombie has no memory of who he was previously, nor understanding of what he has become, and is whipped and exploited cruelly, and fed only on meagre rations. And yes – precisely like a slave.

An even earlier rendering of the filmic zombie came in the shape of White Zombie in 1932. Starring the veteran horror actor Bela Lugosi, the film largely falls flat but – to me at least – was notable for a single scene set inside a sugar mill. In this scene, slaves – or zombies, we’re not sure – work the mill. Nobody here is staggering about in varying degrees of decomposition or attempting to feast on brains; they simply turn the machinery, around, and around, and around. It’s an uncanny, deadening scene. Are they zombies, or just slaves? Either way, they are mindless, dead; slaves who do not remember who they were; who do not know their names; who are unconscious; who exist only for exploitation and labour. Because here’s the thing: in a direct inversion of our now-familiar flesh-eating zombie narrative, the Haitian zombie is not a predator but is afraid of people. His docile, cringing subjection is absolute, but there is one proviso; you must never feed him salt, for if you do, he will remember. He will remember who he is and who he was and everything that has been done to him – and then he will slave no more. I recounted this once to a classroom full of students and one remarked – with great perspicacity – that to taste salt was akin to tasting tears.

We can be pretty sure that the zombie is not a wholly Caribbean invention, but arrived with the slaves from Africa; because we must remember that slavery was systemic within Africa too, although its impact and supporting ideologies shifted dramatically – and devastatingly – once transported west. But this is not a rant about ‘cultural appropriation’ or ‘cultural erasure’ – although conceivably it could be, either or both – it’s simply a call to memory, which is precisely what the zombie does not have. Memory. It’s a call to memory because the zombie – the actual zombie – reminds us of something very important. It reminds us to remember – who we are, and where we came from, and how we came to be – individually, collectively – especially for those of us whose personal and community histories are caught up in the blanketing fog of cultural amnesia. The zombie reminds us to taste salt.

Joint Enterprise: a conspiracy of injustice

A version of this article was published in The Big Issue on March 11th 2016.

Five years ago I was called to jury service. I was soon to find out that, for me at least, jury service involved a lot of waiting. In the end I was only in court for one day, and this day came right at the end of two weeks of state-enforced purgatory.

I say purgatory. The couple of hundred members of the public who, like me, were there to do their civic duty were marshalled into the equivalent of a giant dentist’s waiting room; instead of dentists were judges, somewhere, secreted in the building, dispensing remedial indifference. We were given meagre entertainments: dominoes, jigsaws and dog-eared copies of ‘Chat’ magazine. Every so often names were called out, and twelve people would stand and leave their games of Scrabble (what now, for the triple word score?); they filed away in silence. The rest of us simply waited.

After days of joyless limbo my name was finally called. I stood outside the courtroom, vaguely thrilled after the unremitting excitement of board games. I wondered what kind of criminal I’d be confronted with; I pictured the possibilities. But the one thing I did not expect to be confronted with was a child.

The defendant in the case was only thirteen years old. He was accused of participating in a robbery. In fact he’d only been in the shop at the same time as the robbery took place. The 16-year-old boy shown on CCTV committing the alleged crime had been picked up by police the same day, but was released without charge because he’d refused to answer any questions. His thirteen-year-old alleged accomplice was arrested several months later, and most of the case against him seemed to hinge on his inability to recall under pressure whether or not he’d gone to play football in the park one day (was it Wednesday? Or Thursday?) three months previously.

The flaws in this case, as far as I was concerned, were many. Putting aside the fact that the boy’s barrister did not once remember the boy’s name correctly; putting aside the fact that this child – who, I can only assume, had learning difficulties – was not able even to read the oath that was placed before him; putting aside the fact that this child, who was in foster care, did not have a single family member or supporter in court with him; putting aside the fact that the key witness in the case could not speak English and neither could the court-assigned interpreter; putting aside all these facts and more, the one incontrovertible fact of the case was, as far as I was concerned, the FACT that he clearly did not commit the crime and there was no way of knowing, neither from the CCTV, nor from the paucity of other evidence, whether or not he had prior knowledge of the older boy’s intentions.

At least, I thought that was the one incontrovertible fact. My fellow jurors were not necessarily of the same mind. I am not allowed – even five years on – to discuss what happened in the deliberating room, but suffice it to say, after a day of arguing and ultimate stalemate – in an airless room with next-to-no refreshments – we trooped back into the courtroom twice unable to reach unanimity. Nerves were frayed; time was passing; civic duty was threatening to push the boundaries of expedience. But in the end it didn’t matter. Because finally the judge explained to us that 1. he would accept a majority verdict, and 2. under the terms of Joint Enterprise, we had little option but to find the defendant guilty, simply because he was there at the time.

So we did. We found him guilty. I hope that kid took some solace in the fact that, despite the judge’s directive, we still did not reach unanimity. We came back with a verdict of 11–1. One of us said ‘Not guilty’. But only one. So all the jurors went home in time for tea, and a parentless child was given a custodial sentence.

Was it fair? No. Could I do anything about it? No. I was impotently complicit in what was, I felt, a grave injustice. The farcical Joint Enterprise law was one thing. Equally disturbing were the prejudices and lack of appropriate guidance we jurors took into the deliberation room with us. But most disturbing was the direct observation that the cogs of law and order turn on half-truths and sleight of hand enacted against the most vulnerable members of our society. The recent Supreme Court ruling on Joint Enterprise may be a step in the right direction, but if so serious a flaw can remain unchallenged for thirty years, then what else are we tolerating?

David Cameron, my dad and me

This article was first published in The Guardian on January 28th 2016.

The British public tends, as a rule, to dislike toffs. It’s why fox hunting occasionally escalates to an issue of national crisis, being symbolic of a deeply entrenched class antagonism. And it’s why tales of David Cameron’s alleged antics during his Oxford days have not gone down at all well with a stony-faced public. ‘Pig-gate’, last September, was only the most recent of these stories. But despite my sympathy for both pigs and foxes I feel myself irretrievably, even *umbilically* connected to the ‘Dave’ stories that have continued to drip from his time spent around Oxford’s dreaming spires. 

In the run-up to the 2010 election, a less sensational story of Dave’s glory days emerged, first via Channel Four and then via the Daily Mail. This was the story of Dave’s so-called ‘rasta friend’: Hugh ‘Andy’ Anderson, in whose Oxford bar Dave and his Bullingdon chums spent quality time during the 1980s. Accounts of this unlikely pairing, which detailed how ‘cool’ Cameron was, and how he could hold his own in any conversation about the endlessly complex polyrhythms of jazz and reggae music, were no doubt intended to paint ‘Call-Me-Dave’ in a friendlier, more inclusive light. Maybe it even helped him win the election, who knows? But win it he did.

For me, though, there was a more personal impact. Because Hugh ‘Andy’ Anderson is my dad. Until this Daily Mail article, I’d never even seen a picture of my hitherto abstract father. I knew his name, my mother had told me that much. But until I was in my mid-teens, I didn’t even know he was Jamaican, let alone that he had friends in such high places. In those days there was still some shame attached to racially mixed relationships. My (Welsh) mother was vague on the topic, citing my father’s origins as ’South American’ and refusing to be drawn further. But it was she who duly confirmed, looking somewhat paler than usual, that David Cameron’s colourful chum in the newspaper was, as a matter of fact, my father.

You will imagine this came as a shock; but things could have been worse. Imagine finding out that, say, George Osborne was your father. For me, that really would be horrific. And in the end it was fine, it was good, it was great, it was positive; thanks to the Daily Mail and, by logical extension, our Prime Minister, I was finally able to meet my male progenitor. And there aren’t many people who can make that particular claim. 

David Cameron has been outlining plans that would require Muslim women to enrol on compulsory English language learning programmes or face deportation. There is also renewed talk of a pay threshold for non-EU migrants, under which anyone earning less than £35k could be forcibly repatriated. Of course, none of this has any direct bearing on my dad, who makes a point of reading the Daily Mail even when he isn’t in it. Like so many immigrants, he had no money when he came to this country; he came from a remote rural community in Jamaica that to this day has little access to any of the opportunities or amenities that we take for granted; such as running water, transport links, healthcare or even free education. The community is depleted further by the continual drain of young people leaving to seek a better life. But despite this gradual disintegration, there remains a magma-core of strength and a religious certitude that would shake the Archbishop of Canterbury right down to his socks. The name of that community? Peckham, Jamaica. And I do believe Del-Boy would approve. After all, he’s another self-made man.

Nowadays my father and I retain a more regular relationship with each other than he does with David Cameron – although should DC ever feel like dropping in again, he’d be made ever so welcome. Thirty years on, wealthy ex-public-school students still gravitate to my father’s bar; I could suggest that, for some, it’s just an exotic form of ’slumming it’ – but that would only make me sound cynical. 

By the way – Dave and the Daily Mail should know that my father is *not* in fact, a rasta. He smokes Cuban cigars, not ganja; and he is a licensed bar-owner. Rastafarians do not consume alcohol. Or pork, come to that. Just saying.

Kitty-in-Boots: the secret papers

I wrote this, red-of-eye, the day after my own kitty-in-boots – Polly – went to the great storyteller in the sky, accompanied by the sounds of much heartfelt weeping and wailing. I dedicate this to her memory, and it was first published in The Conversation on January 29th, 2016.

It’s unlikely that, during her lifetime, Beatrix Potter ever imagined that she would one day be trending. But this week came the astonishing announcement that a wholly unknown manuscript has been discovered amongst Potter’s papers. Kitty-in-Boots will be published in September, and since Potter only completed one (entirely charming) watercolour of the titular Kitty, veteran illustrator Quentin Blake will supply the shortfall. All of which is rather magnificent.

Now – I liked Beatrix Potter books as a child, but I came to love them as an adult. In formative years I had several of the tiny, perfectly-formed books – The Tale of the Flopsy Bunnies, and Mrs Tiggy-Winkle, to name but two. These stories are cute, of course. But as an adult I made discoveries of some of her other, less twee, characters that had not formed part of my childhood experience. Such as Ginger and Pickles, a tomcat and a terrier – respectively – who decide to set up shop where everything is sold on endless credit. “But there is no money in what is called the ‘till’”. Inexplicably they go out of business, driven to despair on receipt of the “rates and taxes”. Then there is Mr Tod, an anally-retentive fox who almost-murders the filthy badger, Tommy Brock, and rhapsodises on all the different soaps that might get rid of the taint. “I could never sleep in that bed again without a spring cleaning of some sort”.

One of my enduring memories is driving through the Lake District – Potter country – when my son was small, listening to an audio CD of June Whitfield describing Mr Tod’s Kim-and-Aggie fantasies (“I must have a disinfecting. Perhaps I may have to burn sulphur”). I’m ashamed to admit that the enjoyment was far greater on my side. He was near-traumatised by Samuel Whiskers, the rat who tried to eat Tom Kitten in a giant sausage roll; but then my son, I believe, had unconsciously identified himself with Tom Kitten. Both preferred to be without clothing; for small boys and kittens, habitual nudity is more acceptable than it is for either adult humans or cats.

Like all the best children’s books, the beauty of the tales inhabits (at least) a double-layer of meaning, one for children, one for adults. I’m yet to read the whole of Kitty-in-Boots, but according to the publishers it’s a tale of ‘double identity’ and an encounter between Kitty and ‘the villainous fox Mr Tod’. The available snippet suggests it promises to be a longer, more textually-dense narrative than some of the simpler Potter stories, and with plenty of irony, humour and intrigue that should make it appealing to adults as well as children. And more than a few nods to the world of fairy tale.

Clearly, Potter was referencing the well-known fairy tale Puss in Boots, which originates – at least in literary form – in seventeenth-century France, from the pen of Charles Perrault. This wouldn’t be the first time Potter’s works have drawn on folktale; some of her earliest illustrations are of American folk heroes Br’er Rabbit and Br’er Fox. However, the Br’er Rabbit stories known to Potter would not have been the oral traditional tales originating in African-American slave communities of the Deep South; rather they would have been the stories published by Anglo-American folklorist Joel Chandler Harris from 1880 onwards, under the ‘Uncle Remus’ moniker.

There are, however, references to indigenous British folklore, with which Potter must have been familiar – either locally, via the oral tradition (although this was already significantly diminished by the late nineteenth century) or via fairy tale collections published by British folklorists such as Joseph Jacobs. For instance, the ‘sandy-whiskered gentleman’ who lures poor Jemima Puddle-Duck to a shed full of feathers: “it was almost suffocating; but it was comfortable and very soft”. This tongue-in-cheek reference is lost on many of us today – and certainly on a six-year-old version of me – but there is a very old folktale in which a man named Mr Fox lures women to his home in order to rob and murder them, keeping their gory remains in a bloody chamber. (Of course Jemima ought to have known this but, as Potter tells us, she “was a simpleton: not even the mention of sage and onion made her suspicious”.)

As for our new friend Kitty-in-Boots, there is, too, a British story tradition where a domestic cat leads a double-life. In The King of the Cats, the sexton and his wife describe an uncanny encounter at the graveyard with some pall-bearing cats, whereupon their own cat, who had been lying quietly by the fire, jumps up and shouts: “Well I’m the King of the Cats!” and disappears up the chimney. Interestingly, a version of this story features in what’s now designated as the first English novel: Beware the Cat, written by William Baldwin in 1553. Similarly Mr Fox was referenced by a line in Shakespeare’s Much Ado About Nothing, appearing later in the 16th century.

But not having seen any more of the tantalising Kitty, it’s impossible to conjecture any further. Like everyone else, I shall have to wait until September. Which seems an awfully long way away.

The Dress

This article was first published in the print edition of The Big Issue on January 4th, 2016.

I was almost wholly indifferent to the internet phenomenon that became known as ‘the dress’ early in 2015. Was the dress blue and black? Or white and gold? Who cared anyway? Not me – I saw it as blue and black, which, apparently, was correct, so I had no interest whatsoever in the fact that some (apparently colour-blind) people were seeing it as white and gold. Only when I revisited the dress recently for the purpose of this article did a very distressing thing happen to me: opening a picture of the dress on my iPad, it appeared white and gold; but as I scrolled down to the hem, it became blue and black. Then I found I could flip from one to the other – now black and blue, now white and gold. This was truly distressing, since I’d taken surreptitious pride in seeing the dress ‘as it actually was’ – only now it seemed to take pleasure in doing whatever the hell it liked. Mocking me, even.

From the small shiny rectangle of the screen in front of me, my eyes wandered up to the window; a wintry mid-afternoon in which rainy skies had briefly given way to blue. In the sky I could see two-thirds of the moon, ghostly by daylight, incongruous when not in its night-time habit. Incongruous to us, that is; the moon is there whether we see it or not, day and night. We see it more clearly by night, when it takes on a different sort of reality altogether. We always see the same side of the moon; its rotation on its own axis is synchronised with its orbit around the earth – something I’d once demonstrated successfully using an apple and a satsuma. And at around this time each year, January 4th, the earth on its elliptical orbit moves closest to the sun – an annual event known as the perihelion. What does this have to do with the dress? Nothing; except that the fallout from ‘dressgate’ had us querying, to some degree, the way we perceive reality at all. Arguably the world in which we live is full of visual discrepancies that should make us all question the nature of our own personal realities – as well as whatever we might designate as objective reality – all the time. The very simple fact that the moon and sun appear – from our earthbound visual standpoint – to be precisely the same size blows my mind. The dress doesn’t.

From this point in our calendar, we humble earthlings will begin once again to move farther from the sun, embarking on an elliptical journey into 2016 that will, no doubt, be a continuation of the madness that has become our reality; we closed 2015 to a chorus of Daesh, drones and Donald Trump. Collectively, individually, we worry about the state of the world, but don’t know what to do about it. Everybody has an opinion, although opinions are swept and moulded by the vagaries of social media, which moves public opinion like a monstrous child shifting sand around a sandpit – combining monomaniacal intent with the bliss of no discernible purpose. Meanwhile, the earth keeps travelling its orbit, the sun remains, and the moon shows one side of its face.

Perhaps that’s why the dress became a global phenomenon. In a world where we’ve become accustomed to having our opinions pre-chewed and pre-digested – and in some instances presented to us as finely formed fecal matter – it comes as a shock to see our cherished, but ultimately worthless, opinions shifting before our eyes. Of course, somewhere in physical space and time there was an actual dress, quietly inhabiting its own truth, irrespective of the storm that raged beyond its tangible limits. But elsewhere its endlessly reproduced image became whatever it wanted to be – or whatever we wanted it to be. Black and blue, the colours of night; white and gold, the colours of day.

But perhaps this too served as a metaphor for the way that 2015 unfolded, revealing a world stage of sharply defined division, each side seeing its way as the only way, its truth the only truth. As with the dress, the surface compels the truth. This was nowhere more dispiritingly evidenced than in the House of Commons last month, when Hilary Benn’s speech was a triumph of rhetoric over reason. For all its tub-thumping appeal, it was conceptually void; a shimmer of surface, a mirage compelling its own truth. And as we march, seemingly inexorably, into a 2016 already shadowed by the detritus of conflict, we might remember that black and blue are the colours we associate with violence; white and gold with glory. Is it one or the other, or neither – or both?

David Bowie, Alan Rickman and some thoughts on death and mourning

A version of this article was published in The Conversation on January 15th, 2016.

How lonely it is, to lose someone. And always, how unexpected. Strange that human beings have been on the planet for at least 200,000 years – and still death takes us by surprise. It’s an absence that we’ve never quite become accustomed to. The notion of our own death is a spectre we can never quite imagine; it seems a rather absurd joke. Meanwhile the deaths of those we love plunge us into black isolation, a dull bubble of grief that afflicts us like an illness.

The deaths this week of David Bowie and, just yesterday, Alan Rickman, have resulted in emotional outpourings of shock and collective grief that have filled more column inches and social media hashtags than all the rest of the week’s news put together – more, I’m sure, than either would ever have anticipated. The loss of Bowie in particular, who courted headlines just a week ago with the release of his album Blackstar, has sparked something approximating a national state of mourning in the UK. Followed so closely by the death of Alan Rickman, there is a perception that British cultural life has suffered a major blow; the loss of these two iconic figures standing, in the collective psyche, for something cruel and unusual.

I confess myself not to have been a particular fan either of Bowie’s or of Rickman’s work, and so the reaction is a phenomenon I’ve observed with the curious, but impartial, eye of an outsider. Of their respective repertoires, though, the two works with which I am most familiar are, coincidentally, both implicitly concerned with death: Bowie’s 1980 hit Ashes to Ashes and Rickman’s role in Anthony Minghella’s 1990 film Truly, Madly, Deeply.

I was around 15 years old when I first saw Truly, Madly, Deeply. Rickman plays a ghost who returns to help his former lover – played by Juliet Stevenson – cope with a grief so intense that her life has been rendered unbearable. As a teenager, with limited experience either of love or loss, the plot failed to register a substantial impact. It passed quickly into the recesses of unmemory.

But revisiting the film as an adult renders a different experience. Juliet Stevenson’s portrayal of the devastations of grief is unquestionably one of the finest, and starkest, ever committed to celluloid. And against the backdrop of what amounts to a public display of mourning, I’m reminded how at odds these public expressions of grief are with the private reality of mourning.

Because, collectively, we misunderstand the nature of death when we shake our heads and mutter over those who have left planet earth, as if this is a great tragedy for *them*. It is not. It is a tragedy for those who are left behind. It is, and always has been, the living who suffer – not the dead.

Alongside loneliness and guilt – all the things we wished we’d said, or hadn’t said – comes an unexpected adjunct: fear. It was C.S. Lewis who famously wrote that no-one had told him how much grief felt like fear. Little wonder, then, that so much of what scares us is linked to the perpetual spectre of death.

And – as with Truly, Madly, Deeply, where the longed-for revenant played by Rickman soon becomes an uncomfortable guest – we are reminded to be careful what we wish for. Anyone who grieves will tell you that at the forefront of the psyche is the hopeless wish for the return of the one who is lost. That this desire is central to our collective psyche is evidenced by the defining creed of the past 2,000 years: that of Jesus Christ, who died and returned to us three days later, the cross signifying his eternal resurrection.

But elsewhere on our cultural landscape, the spectre of the revenant is a less welcome visitor.

In British folklore, a memorable example of the unwelcome revenant comes in the form of the demon lover – a young man lost at sea who returns to wreak his revenge on his former lover, whom he considers faithless for taking a new husband. In W.W. Jacobs’ short story ‘The Monkey’s Paw’, an elderly couple wish for the return of their dead son, only to be horrified when the ghostly revenant bangs on their front door. In Toni Morrison’s celebrated novel ‘Beloved’, a dead child returns to the living as a decidedly corporeal adult, and consumes everything in sight.

If cultural symptoms are anything to go by, it seems that, despite our increasing western secularism, our anxieties and ambivalences regarding death – the great leveller – remain. Privately, we grieve for those we’ve loved. Publicly we grieve for those we’ve never even met – an anaesthetic form of mourning that allows us to play on the surface of death and its rituals without having actually to feel it – either truly, madly, or deeply… but if it *is* resurrection we seek, then we can have it. Because celebrity creates its own ghosts; and through the art they produced we can court the revenants at the push of a button… again… and again… and again…

 

Bloodletting: Guy Fawkes and ‘The Crucible’

A version of this article was first published in The Conversation on November 4th, 2015.

Until November 7th, Bristol Old Vic is staging a major new production of Arthur Miller’s classic play The Crucible – a dramatisation of the Salem witch trials that occurred in New England towards the end of the seventeenth century. Directed by Tom ‘War Horse’ Morris, this solid production – with excellent performances from both Dean Lennox Kelly and Neve McIntosh, playing John and Elizabeth Proctor – coincides with the centenary of Miller’s birth on October 17th. First performed in January 1953, the play has long been considered a consummate expression of anti-McCarthyism – a thinly veiled allegory of the communist ‘witch-hunts’ that soured America during the post-war period.

And it’s fitting that at this time of year our thoughts should turn to witch-hunts.

In 1605 Guy Fawkes was discovered in a cellar beneath the Houses of Parliament, checking on the gunpowder kegs that had been left there ready to surprise the King, James I. Although he wasn’t the ringleader, he was caught red-handed; and although torture was technically illegal at the time, King James provided special written authority to proceed with tortures ranging from the ‘most gentle’ to the ‘worst’. After days of undisclosed agony Fawkes confessed, leading to the pursuit and incidental death of Robert Catesby, the actual ringleader. Fawkes, along with some others, was sentenced to be hanged, drawn and quartered; he was considered fortunate to break his neck during the hanging, and so was dead prior to being drawn and quartered. Yet for 400 years afterwards, ordinary British folk have felt it incumbent upon themselves, once a year, to burn effigies of Fawkes on bonfires, with or without an attendant chorus of fireworks.

When I was a kid – back in the eighties – the ‘penny for the guy’ tradition was commonplace. For those who don’t know, this meant stuffing a set of old clothes with screwed up newspaper and rags until it resembled a human form. In the run-up to Bonfire Night, the ‘Guy’ would be wheeled about on a small home-made cart or other contraption by children asking for a ‘penny’. Finally, poor Guy would go up in flames. Again. And again. In recent years this practice seems almost entirely to have died out. It’s difficult to pinpoint the reason why. It would be easy to blame video games and social media for its loss of appeal, but the enduring popularity of Bonfire Night and, increasingly, Halloween suggests there is still an appetite for these traditions.

But Bonfire Night and Halloween are linked by more than just calendrical proximity. The Gunpowder Plot occurred against a background of religious persecution, suspected witchcraft, and plague. Protestant King James was highly preoccupied by both religion and witchcraft, and was the author of – amongst other writings – Daemonologie, which might be described as a treatise on witch-hunting. James’ personal writings frequently equated catholicism with satanism, and in some parts of the country effigies of the Pope and Satan have traditionally been burned along with the Guy. To put it into context: the Gunpowder Plot took place in 1605; Macbeth was written in 1606; the King James Bible appeared in 1611; and the famous Pendle witch trials took place in 1612. Guy Fawkes not only represented treason, but also catholicism and, by implication, devil-worship. It’s not difficult to perceive a link between Halloween’s outing of witches and warlocks and, a few days later, the ritual burning of satanic evil. Bonfire Night represented a purging of *all* society’s ills – a sort of mob bloodletting.

Which brings us back to The Crucible. The Crucible was written at the time of the Rosenberg trial – the couple notoriously executed for passing atomic secrets to the Soviets; colluding with the devil himself. It’s very easy to draw parallels between Guy Fawkes and the Rosenbergs. All were executed, in spectacular and public fashion, for the crime of treason. All have been somewhat cast as scapegoats. And all played out against a backdrop rich with the tapestries of witch-hunting, as fearful societies attempted to exorcise their figurative demons.

But within the play itself, there is nothing to suggest we are reading a critique of anti-communism. If we were to confine our understanding to the text itself, rather than to the secondary commentaries that have tended to dominate it, we would find that the true source of both the witchcraft *and* the accusations (and, by extension, the hangings) is pubescent and pre-pubescent girls. Led by Abigail Williams (played superbly in the BOV production by Rona Morison), these uncannily powerful girl-children hold the locus of control entirely. And uncomfortably, perhaps, all pivots on pubertal female sexual desire and revenge. What we are left with, in the end, is a play where the source of the ‘evil’ comes not from Big Government but from children: specifically, pubescent girls with more or less explicit sexual intent.

This, actually, renders the play far less digestible than a one-size-fits-all commentary on McCarthyism. That this has been so consistently sidestepped by commentators should give us pause, since the scapegoat mechanism of the witch-hunt is situated precisely in our reluctance, or inability, to see ourselves, preferring instead to project that which makes us uncomfortable onto another, sacrificial victim.

Penny for the Guy, anyone?

Freud’s couch on the couch

A version of this article was first published in The Conversation on October 25th, 2015.

On a leafy road in NW London, in the middle of a rose-garden, nestles the Freud Museum – though the rose petals, mid-October, are tumbling. The house, at 20 Maresfield Gardens, is the proud bearer of two blue plaques that adorn its frontage like war-medals: one is for Sigmund Freud, and one is for his daughter Anna. Both lived at the house until their deaths, in 1939 and 1982 respectively.

Inside the house, Freud’s study has been preserved intact; everything that he owned in Vienna – prior to fleeing Nazi persecution in 1938 – was carefully imported and recreated so that he could continue his work unabated: his books, his archaeological artefacts, his leather chair that so curiously and uncannily resembles a human form – and, of course, his famous couch.

For the next six weeks, however, that couch is covered by a blanket.

‘Every Piece of Dust on Freud’s Couch’ is an installation commissioned by the Freud Museum, running from October 8th to November. The artists Adam Broomberg and Oliver Chanarin have created an installation piece within Freud’s study. At one end of the room, a Telex Caramate 4000 noisily rotates a series of monochrome slides. At the other end a blanket covers Freud’s couch. The slides – highly magnified images of fibres – are the product of a forensic team’s investigation into the dust from the rug on Freud’s couch. The dust itself, a handout informs us, is largely keratin; in other words, skin. The blanket on the couch – which initially I had taken for something left there by accident, perhaps by one of the cleaners – is actually a tapestry: the woven image of a highly magnified fibre from that couch.

This is not the first piece of art to be generated from Freudian dust. In 1996 Cornelia Parker created a piece called ‘Exhaled Blanket’ – a slide projection of dust and fibres also collected from Freud’s couch. While I would never argue the case for novelty when it comes to art, nonetheless the academic-pedant in me cries out for a reference.

According to the artists, this ‘exercise in forensics aspires to the language of science and, like psychoanalysis, it attempts something contradictory; the objective study of subjectivity’. While I would argue that there is nothing contradictory about the aim stated, it is true that Freud’s great project was to legitimise psychoanalysis as a ‘science’. Viewed in this light the attempt to frame the physical traces of Freud’s sitters within a forensic, scientific context, has validity.

But so many questions are left unasked.

We are told that the dust ‘may include traces’ of some of Freud’s most famous cases. But of course we don’t know whether the dust thus magnified is actually that of, say, Dora or the Wolf Man, or belongs to someone or something else entirely. We are not told whether the forensic team was able to discover anything beyond identifying any given fibre as ‘cushion’ or ‘feather’. And so the claim that the tapestry is ‘an abstracted portrait of one of its sitters’ does not quite ring true. Furthermore, although we are told that most of the dust on the couch is composed of skin, the magnified slide images are not of skin: rather, each slide is labelled Cushion, Feather, Hair, Coat or Rug.

I could fairly be accused of pedantry for picking on such details. But if we are aspiring to the language of science, then detail is key. Minor inaccuracies – such as stating that Freud’s final years were spent in London – grate. In fact Freud spent only 15 months in London, 12 of them at the house in Maresfield Gardens, where he died in September 1939, just weeks after the start of the Second World War.

This pedant will further admit to disliking the tired trend that insists artworks be accompanied by explanatory – yet inadequate (and inaccurate) – prose. But in the context of Freudian psychoanalysis, the misfit is even more pronounced. Psychoanalysis seeks meaning beneath the surface; Broomberg and Chanarin’s work overlays meaning clumsily on the surface, obscuring what lies beneath. This ‘blanketing’ is made manifest – unconsciously, one must presume – by the gaudy tapestry overlaying the couch. Although one could argue that Freud himself – and certainly many of his followers – did precisely as the artists here have done: superimposed their own impenetrable and subjective meaning over their complex subjects like a dense and opaque blanket.

Alone in an upstairs room, the actual rug from Freud’s couch is still on display. Covered in dust, it somehow has the moribund air of a viewed corpse prior to a funeral. Perhaps the more so, knowing that the dust covering it is largely composed of human skin.

It is impossible to know to whom those human fragments belonged – they are as anonymous as specks of ash from a cremation. To know that Freud’s life and death in this house came so swiftly on the heels of war, and in the context of Nazi persecution, adds a sobering dimension to the fragments of human skin on the couch. The shadow of the Holocaust looms large.

This was not perhaps the meaning the artists intended, but it was the one I took away with me, into that street in NW London, amongst the dropping October rose petals; through memory, association, and a series of unconscious displacements.