Thursday, 13 February 2014

Dayton's Discontents: Life After Peace in Bosnia

More than any other European nation, Bosnia has struggled to shake its association with war in the eyes of the western world.  Tell someone you’re off on a trip there and more often than not they’ll ask, with a look of surprise, “is it safe?”  Until last week, apart from the landmines scattered across Bosnia’s beautiful countryside, the only signs of the war that tore the country apart in the 1990s were bullet and shell marks peppering buildings in many towns and cities.  To the casual traveller passing through, Bosnia is a country of culture, history, good food and stunning scenery, and Sarajevo easily holds its own against any other European capital as a vibrant, cosmopolitan city.  But scratch a little deeper and you’ll find the divisions and disillusions that have flared into violent protests over the last few days.


It started in the north-eastern town of Tuzla last Wednesday.  Unemployed workers took to the streets to protest about their poor pensions and unpaid benefits in a town that was once an industrial hub.  By the weekend the protests had spread to Sarajevo, Zenica and Mostar, where government buildings were set on fire and protesters clashed with the police.  Bosnians are taking to the streets to object to the corruption, nepotism and petty squabbling of their politicians, who they hold responsible for Bosnia’s economic stagnation.

Official statistics put unemployment in Bosnia at 27.5%, but some estimates suggest that up to 44% of Bosnians are out of work.  Youth unemployment is estimated at around 60%.  The Bosnian economy has never fully recovered from the war.  Privatisations of state firms have been rife with crooked deals.  Most of these companies had already racked up massive debts during the collapse of the Yugoslav economy in the 1980s, and the sad truth is that Bosnia no longer has any industry to speak of and no service economy to fall back on.

The war in Bosnia ended eighteen years ago, in December 1995, with the signing of the Dayton Peace Accords.  They left the country divided between the semi-autonomous Republika Srpska (RS) and the Bosniak-Croat Federation.  The government of Bosnia-Herzegovina is headed by a three-member presidency – a Bosniak and a Croat from the Federation and a Serb from the RS – with the Chair rotating every eight months.  Though the idea of electing politicians based on ethnicity is troubling, in reality Bosnia’s politicians are distant, bloated elites barely distinguishable from one another, regardless of their nominal party.  Below the national government are layers of local government: the Federation is divided into cantons, which are in turn subdivided into municipalities.  This multi-layered and unwieldy system leads to frequent political deadlock, such as that seen last summer over the failure to pass a law on issuing ID numbers.  The stalemate, which resulted in the death of a baby who was unable to leave the country for vital medical treatment, sparked protests in Sarajevo that subsided after only a few days.  But the embers have been glowing ever since, and last week they were ignited once more.

The aim of the Dayton Peace Accords was to end the fighting in a way that would be acceptable to all sides.  In many ways it is an impressive piece of diplomacy, creating a structure within which Bosnia could begin its recovery from the conflict free from inter-ethnic power struggles.  But, nearly twenty years on, Dayton has its critics.  Eric Gordy believes that Dayton created a “permanent, parasitic political class.”  Dayton also left Bosnians, like naughty schoolchildren, under international guardianship in the shape of the Office of the High Representative, an international envoy who is unaccountable to the Bosnian people.  Others have argued that Dayton left Bosnia’s political situation unstable and divided along ethnic lines.

Prior to Dayton, the most serious attempt at a peace deal was the Vance-Owen Peace Plan (VOPP), put forward in 1993, which proposed dividing Bosnia into ten semi-autonomous cantons.  The VOPP was rejected by the three warring parties, who could not agree on the divisions.  By August 1995, when an intense NATO bombing campaign finally forced the Bosnian Serb Army (BSA) to the negotiating table, the situation on the ground had changed sufficiently to allow the parties to sign a peace deal without losing face.  The UN “safe areas” in eastern Bosnia had fallen to the BSA, eliminating the troublesome Bosniak pockets and leaving the east of the country virtually ethnically homogeneous, and the Bosniaks and Bosnian Croats had formed an uneasy alliance against the Bosnian Serbs.  Under these circumstances, peace was achieved and Bosnia, which had dominated news bulletins for three years, slipped from the public consciousness.

“Look at us now, my God! Look at Vucko and cry.”
The world had forgotten about Bosnia.  It took three days for western news outlets to pick up on last week’s protests, which have now disappeared once more from the headlines.  It would seem that as long as the Bosnians aren’t killing each other, the world doesn’t want to know.  In a somewhat jarring juxtaposition with the protests, British gold medal-winning ice dancers Jayne Torvill and Christopher Dean returned to Sarajevo today to celebrate the thirtieth anniversary of their success at the 1984 Winter Olympics.  It seems it is easier to remember the Boléro than the plight of the Bosnian people.

There have been calls in equal measure for further engagement by the international community and for Bosnian self-determination.  Peace deals imposed by foreign diplomats which leave nations crippled, humiliated and disgruntled can be dangerous.  Yet Dayton cannot take all the blame, and it is important to note that the protests are targeted at Bosnia’s own politicians, not the international community.  Discontent has been brewing in Bosnia for some time, held back only by fear of a descent into the violence of the 1990s.  Bosnians need to take ownership of the problems in their country.  The protests, which have resulted in the resignation of a number of cantonal leaders, are a start.  Far better than accepting terms imposed by outsiders is initiating internal reform: by the people, for the people.

Whilst some believe that the protests have united Bosnians from all ethnic backgrounds, they have made little impact so far in the RS.  Some Bosnian Serbs fear that the movement will undermine the autonomy of the RS, which is not as dysfunctional as the Federation.  Although it seems impossible to talk of Bosnia without mentioning ethnic divides, there is a danger that ethno-nationalists will try to hijack the growing national discourse for their own ends and steer it off course.  If real change is to be effected in Bosnia, this cannot be allowed to happen.


"Courts and police are protecting the gangs in power"
The protests need direction if they are to achieve anything.  The frustration is understandable, but mindless vandalism is not the answer.  There are signs that the protests are starting to organise, with a series of plenums convened across the country.  The ruling political class cannot be expected to tear down a system from which they reap such great financial rewards.  What the protesters need is someone to direct the anger and despair: leaders who are not part of the status quo and who can translate frustration into the reforms needed to stimulate the economy.  The early elections that have been called for will be pointless until someone emerges to take on this mantle.  When they do, Bosnia’s Spring may finally bloom.

Saturday, 18 January 2014

Marrying Mr Darcy: An Act of Feminism?

The conclusion of Jane Austen’s best-loved novel, Pride and Prejudice, reads like the plot of an unimaginative fairy tale – girl of modest means falls in love with and marries a rich and handsome man and lives happily ever after.  By today’s standards Elizabeth Bennet’s eventual achievement of finding herself a husband is not much of a triumph, yet she remains one of the most enduringly popular characters in literature.  Her story has captivated me many times over.  Is this because secretly all I long for is financial security as the wife of my very own Mr Darcy?  Or does Elizabeth hold a deeper attraction, as a character?  Is there any reasonable justification for my affection for this eighteenth-century genteel heroine?

The first thing that we notice about Elizabeth is her wit – her “impertinence”, as she calls it; or the “liveliness of [her] mind”, as Mr Darcy prefers.  Like many I sat down to read P.D. James’ Death Comes to Pemberley with glee – legitimate fanfiction, by a reputable author!  But the book left me disappointed and it took me a while to realise why.  It was because James had reduced Elizabeth to a monochrome, docile wife; she has none of the sparkling wit that so endears her to us in the original novel.  Absent was her “lively, sportive manner” of talking to her husband, described in Pride and Prejudice’s closing chapter.  It is a testament to the timelessness of Austen’s writing that the vast majority of the dialogue in Andrew Davies’ 1995 BBC adaptation is lifted straight from the book and still has the power to make us laugh.

Elizabeth is always laughing.  She herself declares that she “dearly love[s] a laugh” but qualifies this with the hope that she does not “ridicule what is wise and good”, only “follies and nonsense, whims and inconsistencies.”  This distinction sets her laughter apart from that of her younger sisters, Kitty and Lydia, and of her mother.  Lydia is often described as laughing; after eloping with Wickham, she writes that she can “hardly help laughing myself at your surprise tomorrow morning, as soon as I am missed.”  Lydia has no sense of when laughter is appropriate, whereas Elizabeth has a keen awareness of this.

That Elizabeth flouts social convention and expectations of women in some ways (“jumping over stiles and springing over puddles” on her way to see her sister Jane when she is taken ill, rambling in solitude in the parkland at Rosings) is tempered by her acute consciousness of decorum.  In response to her mother’s loud and vulgar pontificating on the subject of Jane and Bingley’s marriage at the Netherfield Ball, Elizabeth “blushed and blushed again with shame and vexation.”  Austen uses Elizabeth’s blushes throughout the novel to indicate a “social awareness that others lack.”[1]  Elizabeth knows when and where it is appropriate to direct her mischievous conversation.  Darcy, though more reserved, is also capable of this, as his conversations with Miss Bingley show.  It is this, alongside Eliza Bennet’s “fine eyes”, that first attracts him to her.  Though he believes her manners are “not those of the fashionable world” he is “caught by their easy playfulness.”  He, like us, does not care for the simpering pretensions of Miss Bingley, preferring instead intelligent exchanges with Elizabeth.

With her great eye for “minutiae”[2], Austen fills her novels with gently mocking observations of the ridiculousness present in the society of her time, and through Elizabeth we hear Austen’s own voice.  Most of the characters she mocks are so pompous and oblivious that they do not realise it, and so we share in the private joke with Elizabeth and Austen.  It is this balance of disregard for the good opinions of those she does not respect with recognition of the impropriety of her family’s behaviour that identifies Elizabeth as the rational voice guiding us through the follies of Regency society.

Elizabeth is not without faults, which is perhaps why we love her more than Austen’s other heroines.  She is not the meek Fanny Price of Mansfield Park, nor the unwaveringly sensible Elinor Dashwood of Sense and Sensibility, nor the sometimes irritatingly delusional eponymous Emma Woodhouse.  She certainly has her flaws – her hasty prejudices, to name one – but she retains an intrinsic goodness and an all-important self-awareness, which is familiar to the modern reader.  Though she is quick to judge Darcy based on Wickham’s lies about their shared past, when she learns the truth she is happy to acknowledge her error.

There is one aspect of Elizabeth’s behaviour that divides those who debate her status as a feminist character.  Her determination to marry for love leads her to reject not one but two proposals.  At first glance this seems a vindication of Elizabeth’s position as a thoroughly modern woman – who amongst us could contemplate the unendurable agony of a marriage to the absurd Mr Collins?  Should we not celebrate her courage in turning down Mr Darcy, with all his wealth, because she does not like him?  Mr Darcy, as would be expected of a man of his social standing at the time, is certain of her acceptance, despite the insulting pride and arrogance of his proposal.  Her refusal is so strong, and so unusual for the time, that we cannot help but cheer her on.

We join Elizabeth in her shock at her friend Charlotte Lucas’ acceptance of Mr Collins' offer of marriage; in her “distressing conviction that it was impossible for that friend to be tolerably happy in the lot she had chosen.”  We see later that Charlotte is embarrassed by the gaucheness of her husband, but she defends herself: “I am not a romantic you know.  I never was.  I ask only a comfortable home.”  Marrying well was one of the few options available to women at the time, and Charlotte has submitted to pragmatic realism.  And thus we start to question whether Elizabeth is, in fact, acting selfishly in turning down two men who have the means to provide financial security for her and her sisters, whose father’s estate is entailed away from them on his death.

Lyme Park, standing in for Pemberley in the 1995 BBC version of Pride and Prejudice
In Georgian England, these decisions were not merely a question of love versus money.  That view “crudely reduce[s] the intricacies of human choice”, for “surely the strategic and emotional are blended in all of us?”[3]  Indeed, we can never be sure that Elizabeth is not a little serious when she tells Jane that her love for Mr Darcy dates from the moment she “first saw his beautiful grounds at Pemberley.”  It does appear that Elizabeth’s feelings have already begun to change on arrival at Pemberley, before Mr Darcy has made his unexpected entrance.  In the eighteenth century, the “notion that the way a man landscaped his grounds might give some indication of his moral and mental qualities” was not uncommon[4], and we may reasonably conclude that Elizabeth’s admiration of Pemberley’s extensive woods and natural beauty is due to her appreciation of Mr Darcy’s tastes, and not to her desire for his fortune.  Nevertheless she does muse, apparently with some regret, “of all this I might have been mistress.”

Pride and Prejudice is set at the end of the eighteenth century, and its characters would doubtless have been aware of Mary Wollstonecraft’s A Vindication of the Rights of Woman, published in 1792.  Against this backdrop, we may judge Elizabeth harshly for succumbing to what society expected of her and settling for life as a gentleman’s wife.  On the other hand, this is no marriage of convenience, and we should not begrudge her a life of comfort, for it is not one chosen in haste.  In Georgian England, where marriages were primarily business arrangements for the consolidation of assets and the procreation of heirs, and where wives lived entirely subservient to their husbands, hers, we are confident, will be a marriage based on love and, most importantly, respect.  Safe in the knowledge that Mr Darcy so admires the qualities in Lizzy that we do ourselves – so much so that he has been inspired to change his opinions and his manners – we are able to anticipate her future happiness.

Is Elizabeth Bennet a feminist, a realist, or simply a romantic?  It may well be that she is a mixture of all three in equal measure, set apart from her contemporary literary heroines by her passion, her flaws, her complexities and, ultimately, her humanity.  She is as real a woman to today’s readers as she was to her 1813 audience and will unquestionably remain so for many centuries to come.





[1] John Mullan, What Matters in Jane Austen? Twenty Crucial Puzzles Solved, (London: Bloomsbury, 2012), p.260.
[2] Ibid., p.5.
[3] Amanda Vickery, The Gentleman’s Daughter: Women’s Lives in Georgian England, (New Haven: Yale University Press, 1998), p.44.
[4] Tony Tanner’s notes on the 1972 Penguin edition of Pride and Prejudice, p.397.

Monday, 23 December 2013

They Can't Take That Away From Me

7 October 1955.  Norman Granz’s all-star line-up, Jazz at the Philharmonic, plays Houston, Texas.  Amongst the performers on the bill are Ella Fitzgerald, Dizzy Gillespie, Oscar Peterson, Buddy Rich and Illinois Jacquet.  The Houston Music Hall is packed out with an audience of all ages and ethnicities, all sitting together, going wild for the jazz stars.

In Houston, in 1955, this was unprecedented.  The previous year, the historic Brown vs Board of Education ruling had overturned the farcical ‘separate but equal’ doctrine of Plessy vs Ferguson (1896), which the southern states of the US had used as justification for segregating blacks and whites in all areas of public life.  Despite this, Houston was still a divided city.  The concert took place nine years before the Civil Rights Act was signed into law by President Johnson, effectively ending legal segregation, and fourteen years before the Texas State Legislature enacted its own desegregation laws, in 1969.

Mixed-race audiences had been enjoying jazz concerts together for some time in a few cities in the North, where segregation, although not legally sanctioned, was in de facto operation in most areas of society.  The 1955 JATP gig is widely recognised as the first desegregated concert in Houston.  Granz, who saw the potential of jazz to advance the cause of civil rights, had hired the venue and knew his line-up was attractive enough that he could dictate his terms.  He insisted that the “White” and “Negro” signs be removed from the toilets.  Tickets were sold on a first-come-first-served basis, regardless of colour, and there were no advance sales, preventing anyone from block-booking seats and allocating them to whites only.  Some white audience members objected to being seated next to blacks – Granz gave them their money back and asked them to leave.

L-R: Ella Fitzgerald, her assistant Georgiana Henry,
Illinois Jacquet and Dizzy Gillespie at Houston Police Station.
The concert was an indisputable success, but the spectre of Jim Crow was not completely absent from the auditorium that night.  Undercover Vice Squad officers managed to get backstage and raided Ella Fitzgerald’s dressing room, arresting several of the stars for gambling over a private game of craps.  Granz paid a $10 bond and they were released from custody in time for the second show, with audiences none the wiser.  There were a number of reporters at the police station that night, and although the episode was humiliating for the stars, the Houston Police Department didn’t come off too well in the press coverage that followed.

From the 1920s onwards, the audiences that major black jazz musicians attracted, drawn from across the social spectrum, helped break down barriers between communities in the USA.  Music has often been ahead of its time when it comes to social change.  With its roots in the music of slaves brought from Africa to North America, jazz was intrinsically entwined with the struggle for equality, but its popularity amongst a wide range of people brought integration to its audiences before many other areas of society.

American folk music also played a key role in the civil rights movement, as well as in anti-Vietnam War protests in the US.  In Eastern Europe, in Hungary, Czechoslovakia and Romania, traditional folk music was used as a form of defiance against the Soviet puppet governments.  But it is not always the obviously political songs and artists that spark the significant transformations: few jazz songs and pieces were explicitly about the civil rights movement.  Sometimes the simple act of bringing a crowd together for a shared musical experience is potent enough to push forward new social agendas.  It is the grassroots music, the music in our souls, that moves us and lets us share a common bond with those around us, that is the most powerful.

So don’t despair about today’s pop musicians.  They may not appear to be singing protest songs, but their music is shaping the minds of a generation of young people.  With the advent of the internet and the availability of recording equipment, the music scene has changed radically in the last decade, but we should not doubt the power of music to unify, and to be ahead of its time.  The musicians who took to the stage in Houston, way back in 1955, came to play good music, not to change the world.  Yet good music is, whether it knows it or not, one of the first steps towards that goal.

Merry Christmas, folks!


Friday, 22 November 2013

Of Assassinations, Faith and Intrigue

“President Kennedy was shot by two CIA agents, one in the Texas School Book Depository and the other on the grassy knoll.  I believe that Jack Ruby was acting under the misinformation that the Mafia was behind the assassination and that he received help from Dallas police officers.  I believe that the CIA wanted Kennedy dead because they blamed him for not backing up the anti-Castro Cuban rebels.”


Thus concluded my ten-page school project on the assassination of JFK.  For a while, when I was fifteen, I was more than a little obsessed with Kennedy’s death, reading and watching everything available and convincing myself that the whole thing was a massive conspiracy on the part of the US government.

On the fiftieth anniversary of Kennedy’s death, there has been a litany of articles, blogs and documentaries dissecting the unanswered questions about that fateful day in Dallas.  How many shots were fired?  Where were they fired from?  Who was the man on the grassy knoll?  Why was the open-topped presidential motorcade travelling below the prescribed speed?  Was Kennedy’s brain removed before his body was buried, and if so, where is it now?  Why were frames removed from the Zapruder home movie?  What motivation would a nightclub owner with ties to the mob have to shoot Lee Harvey Oswald before he could stand trial?  Why did the 500-plus page Warren Commission report have no index?

As a historian, a lot of my research has, both willingly and grudgingly, involved debunking popular intrigue about historical events.  There is no evidence that Margaret Thatcher ordered the sinking of the retreating General Belgrano during the Falklands War.  The moon landing wasn’t faked.  And it would seem overwhelmingly likely that one troubled young man on the sixth floor of the Texas School Book Depository was responsible for the assassination of President John F. Kennedy.


The confluence of circumstances on 22 November 1963 makes a compelling case for conspiracy – a young Democratic president, who had defused the Cuban Missile Crisis but blundered in the Bay of Pigs, who was pushing forward civil rights legislation and cracking down on organised crime, who had a beautiful wife and a young family, had been gunned down in his prime.  Though he had his enemies, Kennedy was a popular president, at home and abroad, emblematic of a new era of hope in global politics, and to many it was impossible to believe that one man acting alone could end his life so suddenly.  How could one ordinary citizen with a mail-order rifle be responsible for killing the most powerful politician in the world?  Rather than accept this sobering fact, it has become easier, more comforting even, to imagine that it was all the work of a web of shadowy characters behind the scenes of power.  To accept that theory is to deny the unpredictable nature of the assassin’s bullet.

Human beings love conspiracies.  They fill the pages of airport thrillers, pack the television schedules and make fantastic films.  JFK’s assassination was itself given the Hollywood makeover by Oliver Stone in 1991, which only fanned the conspiracist flames.  We have a particular tendency to deny the existence of random acts of madness or, even more so, tragic accidents.  It’s the reason why so many people cling to the idea that Princess Diana’s death was orchestrated by the British royal family.  Coupled with the legends that develop around them, bound up with the mysterious draw of ‘what might have been’, the deaths of popular figures come to be seen as momentous events which can only have happened for a reason.

The majority of us love to embrace faith – “the substance of things hoped for, the evidence of things not seen” – whether in religion, in conspiracy theories or in the unshakeable dependability of our heroes.  The alternative is to accept that there is no more to life than that which we see in front of us.  For me, it’s the existence of extra-terrestrial life.  I’m fairly convinced that aliens haven’t made contact with Earth (though open to persuasion!), but like Fox Mulder, I want to believe they exist out there somewhere, and that we aren’t alone in the universe.

We feed on stories like Snowden’s NSA leaks because we like to think that our governments are hiding things from us.  They must be, because there has to be a reason why our lives aren’t quite as good as we’d like them to be.  There has to be someone to blame; someone pulling the strings, keeping us in the dark.  Hey, if we’re living in the Matrix, that would explain why we don’t feel as fulfilled or happy or earn as much money as we’d like.  Unfortunately, this theory discounts another significant human trait – our inability to keep secrets.  As we see daily, secrets leak.  To quote The West Wing’s CJ Cregg, “there is no group of people this large in the world that can keep a secret.  I find it comforting.  It's how I know for sure that the government isn't covering up aliens in New Mexico.”  Disappointingly, what may appear to be mysterious anomalies that could only be the result of a complex plot just waiting to be exposed by a plucky maverick can almost always be attributed to miscalculations, misinterpretations and, sometimes, simple, foolish mistakes.  Never underestimate the ability of your government to bungle.

Even when the archives are opened and we have definitive answers about the Kennedy assassination, there will be people who won’t believe them.  It’s harder than you’d think to shake a person’s faith.  And what about me?  Have I outgrown my teenage notions of CIA cover-ups and double agents?  The rational part of my brain would like to think so, but there’s a bit of my fifteen-year-old self inside who still clings to the hope that the conspiracy will one day be revealed.

Saturday, 9 November 2013

Bridging the Divide

By early afternoon the hordes were already gathering on the bridge.  One diver strutted back and forth along the parapet, flexing and posing, while another worked the crowd, taking BAM, euros and even kuna from the tourists who had flocked to Mostar on day trips from the Dalmatian coast.  They hadn’t timed it right this time.  They'd waited too long and the crowds had become bored, dispersing into the old town to search for coffee pots and ice cream, postcards and paintings.  Or perhaps they’d timed it perfectly, returning to the club house on the turret of the bridge with a wodge of notes.

Two hours later we return to the bridge and the diver is back.  This time he jumps, arms outstretched, before pulling himself pencil-thin to plunge feet first into the fast-flowing bright green waters of the River Neretva below.  Young men have been diving off the Stari Most for centuries to impress the local girls, and during the summer months the divers of the Mostar Diving Club do a brisk trade with tourists who pay to photograph them making the iconic 22-metre jump.

Twenty years ago today, on 9 November 1993, the 427-year-old Stari Most was destroyed by Bosnian Croat forces during the Bosnian war.  Its destruction served no strategic military purpose – there were other modern bridges across the river – but was instead an act of cultural vandalism.  The bridge was demolished for the Ottoman heritage it represented and the place it held in the hearts of the city’s inhabitants.

Footage of the bridge’s destruction was shown around the world and for some reason, as the shelling of Dubrovnik’s old town had done during the Croatian War in 1991, it stirred feelings of outrage among westerners.  The war had seemed morally unambiguous when Bosnian Serbs, sponsored by Serbia, launched an offensive against the dream of an independent multi-ethnic Bosnia, but when fighting broke out between the Bosnian Croats and Bosniaks, the conflict became far more complex, as western politicians never tired of telling reporters.  How could they possibly be expected to intervene in such a complicated civil war, where all sides were attacking one another?

Mostar served as the frontline between the Bosnian Croat forces (HVO) and the Bosnian government forces (ARBiH) during the early years of the war and was the scene of fierce fighting.  The fragile alliance between the HVO and ARBiH in the latter half of the war did no more than paper thinly over the fissures in Mostar’s society.

When the rebuilt bridge opened in July 2004 it was hoped that, in the ultimate act of symbolism, it would reunite a city that had been torn apart during the war.  Nine years on, the divisions are as evident as they ever were.  Mostar truly is a tale of two cities, divided by the Neretva, with the Bosniak population on the east bank and the Croats on the west.  The city duplicates all its municipal services – two central post offices, two water suppliers, even two fire services.  Infighting between Croat and Bosniak politicians brought the local government to a standstill earlier this year.  The western half of the city is noticeably more affluent, with Croatian flags flying and Croatian nationalist graffiti scrawled on the buildings.  Try finding a Sarajevska on this side of the river.

This uneasy coexistence is mirrored across Bosnia.  The recent debacle over ID numbers in the parliament has only inflamed tensions between the Republika Srpska and the Bosnian Federation.  Bosnia-Herzegovina qualified for their first World Cup in October – a feat celebrated in Sarajevo; less so in Banja Luka and Mostar.  What comes across strongest in eyewitness accounts of the war is a certainty that the majority of Bosnians were good, peace-loving people, that they all got along exceptionally well before the war, and that the fighting was perpetrated by outsiders.[1]  Yet a war was fought there, and for many the memories are still raw.

Today the shell-marked, bombed-out buildings interspersed amongst the renovated ones along the old frontline are a stark reminder of the fighting that devastated Mostar just twenty years ago.  The words “don’t forget” are painted on a stone in the old town, so unobtrusively small that many miss them.  But it turns out that many of Mostar’s citizens have not forgotten.  And whilst the tourists queue up to take photos of the local divers, browse for trinkets in the cobbled streets of the old town and then get back on their coaches for a whistle-stop tour of Medjugorje before heading back to Croatia, it is painfully obvious that it will take a lot more than rebuilding a bridge to bring many Bosnians back together.





[1] Svetlana Broz, Good People in an Evil Time: Portraits of Complicity and Resistance in the Bosnian War (New York: Other Press, 2004).

Sunday, 15 September 2013

Opening the school door and ending poverty

52% of the current cabinet were privately educated
Twice a week I run past the private school a few streets away from my house.  I peer through the fence in wide-eyed envy.  The autumn term has just begun and the summer’s immaculate cricket pitch and tennis courts have been replaced by pristine hockey and rugby pitches.  At a time when state schools are selling off their playing fields, is it any wonder that 37% of British medal-winners at the 2012 Olympics were privately educated?  Private schools educate just 7% of the British population but their former pupils make up 35% of MPs, 54% of leading journalists and 70% of judges.  96% of privately educated pupils go on to university, compared with 36% of those who attend state schools.  That percentage drops to 14% when it comes to children eligible for free school meals.

Is there anyone who truly believes this is because children whose parents cannot afford to send them to private school are inherently less intelligent?  The gulf between the academic achievements of children from disadvantaged backgrounds and those from wealthy families has not closed since the establishment of universal secondary education in 1944.  And this inequality extends beyond private schools.  The introduction of free schools and the expansion of academies under the current government have generated yet another highly selective tier within British education.  A huge disparity in educational attainment exists between the children of parents in affluent localities, who can afford to move between catchment areas to secure places at the best state schools, and children whose families will never have that option.  This lack of social mobility ensures that schools in the deprived areas of Britain are unable to lift themselves out of a perpetual cycle of poor performance.

Some very telling statistics:

  • 11-year-old pupils eligible for free school meals are around twice as likely not to achieve basic standards in literacy and numeracy as other 11-year-old pupils.
  • 1% (6,000) of pupils in England obtained no qualifications in 2009/10.
  • 7% (47,000) of pupils in England obtained fewer than 5 GCSEs in 2009/10.
  • 15% of boys and 10% of girls eligible for free school meals do not obtain 5 or more GCSEs.

(From http://www.poverty.org.uk/)

It is shocking that in a developed nation 16% of our adult population cannot read at the level expected of an 11-year-old.  A 2011 study revealed that 42% of the 566 employers interviewed were “not satisfied with the basic use of English by school and college leavers.”  The study also revealed that adults with poor literacy were least likely to be in full-time employment at the age of thirty and that “63% of men and 75% of women with very low literacy skills have never received a promotion.”

The Centre for Social Justice has warned of an ‘education underclass’ developing, reporting that some children from disadvantaged backgrounds start school at five “still in nappies, unable to speak or not even recognising their own name.”  The centre-right think-tank blames the parents for this, but it is depressingly likely that, failed by the education system, these children will leave school with poor – or no – qualifications, and go on to send their own children to school in a similar condition.  Taking a step back from the statistics, it doesn’t take a university-educated genius to recognise that equality of access to education is the key to breaking the poverty cycle.  Schools should be enormously expensive for the government and completely free for all children.  No one should be able to buy a better start in life.

Giving Britain’s children equal life chances is not simply a matter of providing every child with the same curriculum structure.  It is about making up the gap in more quantifiable resources.  The pupil premium, which was intended for this purpose, has had limited success.  Though I didn’t grow up in a particularly affluent household, my parents packed the house with books and placed a high value on learning.  Many children are not lucky enough to have this kind of support at home.  With libraries closing down all over the country, it is more important than ever that children are given access to books through school, above and beyond the textbooks essential for their lessons.  A recent study revealed that children who read for pleasure do better in other areas of education.  All children should be given the opportunity of free music lessons from an early age, which studies have shown to have a positive effect on psychological development.  They should be given access to the arts, to the theatre and concerts, and to school trips abroad.  These may all sound like woolly liberal delusions, but that’s because the chances are that if you are reading this, you were given these opportunities as a child.

Britain’s schools have a crucial role to play in tackling the country’s obesity epidemic.  The introduction of free school meals across the board would remove the stigma associated with them and ensure that all children receive at least one healthy meal a day.  Universal free school meals have been piloted in some of London’s poorest boroughs – and the scheme seems to be working.  School cookery lessons should focus on how to cook healthy meals on a budget, children should be taught how to grow their own food, and they should be given the opportunity to try many different sports, spending several hours on PE every week.

Even more important is the role which schools can play in helping children make the best decisions about their future.  A recent report found that “three quarters of schools visited by Ofsted were not delivering adequate careers advice.”  It is hard to imagine yourself doing something if it has never been presented to you as an option.  Even those state school students who get to university don’t do as well after graduating as their privately educated counterparts, who are “far more able to draw upon family resources and had access to influential social networks to help get work experience or internships during their degree.”  Here there is a role for business and industry in education – in providing outreach to schools and work experience placements for students from all backgrounds.

To raise standards in British education, it is time to start thinking about drastic changes to the status quo.  Much has been made of the success of Finland’s education system, which ranked third in the world in the OECD’s 2009 Programme for International Student Assessment (PISA) and is “built upon values grounded in equity and equitable distribution of resources rather than on competition and choices.”[1]  Three of the most important features of the Finnish system are a later school starting age, smaller class sizes, which allow education to be child-centred, and the freedom for teachers to try different styles of teaching within a standardised national framework.

Finnish children, like many others around the world, start school at the age of 7 and do not take any formal national examinations until they reach secondary school.  Britain appears to be moving in the opposite direction, with formal education beginning while children are at nursery, the majority of children starting school at four, and plans to move the assessments of the three Rs that children currently take at 7 to the “early weeks of a child’s career at school”.  In a letter to the Daily Telegraph this week, a group of 127 academics, teachers, authors and charity leaders highlighted their concerns about the dominance of formal education during children’s formative years and emphasised the importance of “active, creative and outdoor play which is recognised by psychologists as vital for physical, social, emotional and cognitive development”.  Despite the success of this approach in many other countries, a spokesman for Michael Gove dismissed the concerns as “bogus pop psychology” belonging to a “powerful and badly misguided lobby who are responsible for the devaluation of exams and the culture of low expectations in state schools”.

Smaller class sizes are one of the chief benefits of a private education in this country.  Britain requires significant investment in teacher training to reduce class sizes so that education can be tailored to pupils’ individual needs.  Teachers should be given the freedom to explore new methods of teaching.  Yet Education Secretary Michael Gove remains rigidly opposed to creative approaches, favouring instead traditional methods of learning what he sees as core academic subjects.  He is woefully out of touch.  Technology has moved on, and children need to be given the skills to use it to their advantage.  And you know what, Michael?  Learning should be fun.  If it’s fun, the knowledge is more likely to be retained, and it is far more likely to pique a child’s further interest in a subject.

During World War II, with all the talk of universal education, private schools thought their death knell was sounding, but the 1944 Education Act left them intact.  Education Minister George Tomlinson, explaining the decision not to abolish private schools, declared in 1947 that the Labour Party had “issued a statement of policy in which it looks forward to the day when the schools in the state system will be so good that nobody will want their children to go to a private school.”[2]  Nearly 70 years on, we are nowhere near this goal.

Yes, these changes will be expensive and no, there will not be any immediate benefit to the taxpayers who will fund them.  There is, though, an economic argument here.  The state cannot afford to continue funding welfare for the number of British families living in poverty, and it has a vested interest in improving their prospects so they can contribute to the state system through tax.  But more importantly, there is a fundamental moral imperative to ensure that all children have the same chances in life, regardless of their background.  In the sixth-richest nation on Earth, 3.5 million children live in poverty and are far less likely to do well at school than their more privileged peers.  Britain is at risk of becoming irrevocably divided along economic lines.  The only way to reverse this trend is through universal access to exceptional education.


[1] Pasi Sahlberg, Finnish Lessons: What can the world learn from educational change in Finland?, (New York: Teachers College Press, 2010), p.127.

[2] Ken Jones, Education in Britain: 1945 to the Present, (Cambridge: Polity, 2003), p.17. 

Monday, 29 July 2013

The Cold War is Dead - Long Live the Cold War!

Not once has Britain been directly threatened by a hostile nuclear power.  Earlier this year, when North Korea readied its warheads for launch, the targets were South Korea, Japan and, of course, the USA.  The Cuban Missile Crisis, widely acknowledged to be the moment the world came closest to nuclear destruction, was a confrontation between the Soviet Union and the United States.

From the end of World War II onwards, Britain’s global influence has steadily declined.  The war left us with a severely damaged economy to rebuild, and the inclusion of Britain in the Yalta and Potsdam conferences was a tokenistic gesture in recognition of Britain’s wartime contribution; Churchill, and then Attlee, represented a nation whose glory days were fading fast.  As the Empire slowly disintegrated over the next quarter century and the US and Soviet Union grew more powerful than Britain had ever been, even at the height of its influence, the UK was shunted into the wings of the global stage.

Far from being something to lament, as those nostalgic for an era when Britannia ruled the waves believe, this is something to be grateful for.  The number one nemesis of today's rogue nuclear states is the USA.  The states known or suspected to possess nuclear weaponry – Russia, China, North Korea, India, Pakistan, Israel and Iran – have little to no interest in Britain as a strategic target.  Though our politicians would have us believe otherwise, Britain is globally insignificant.  We are one of a number of European nations, a little fish in a big pond, and our ‘special’ relationship with the USA is best described as one of those somewhat awkward love affairs where it is painfully obvious to observers that one party is just a little bit more into the whole thing than the other.  Whether we like it or not, this is how the rest of the world views us.  So maybe it is time to embrace this realisation.

That Britain is a declining world power is just one of a long list of reasons why it makes sense, economically, politically and morally, to eliminate the UK’s nuclear capability.  Top of that list is indisputably that nuclear weapons are an ineffective and inappropriate response to the threats we face in a post-Cold War world.

Britain tested its first nuclear weapons in 1952
The Cold War officially ended with the dissolution of the USSR in 1991, but it had been drawing to a close throughout the 1980s.  President Ronald Reagan announced in his 1985 inaugural address that his administration sought “the total elimination one day of nuclear weapons from the face of the Earth,” marking a definitive departure from the pro-nuclear policies that had prevailed in previous decades.  Soviet leader Mikhail Gorbachev’s twin policies of ‘perestroika’ and ‘glasnost’ (economic restructuring and openness) signalled a shift in the Soviet position, as well as an admission of the crippling effect that the nuclear arms race had had on the Soviet economy.  The thawing of the Cold War was confirmed by the INF Treaty, signed in 1987, and cemented by START I, signed 22 years ago this week by Gorbachev and President George Bush Snr.

When the bombs were dropped on Hiroshima and Nagasaki in 1945, it was predicted that any future war would result in nuclear holocaust.  This turned out to be erroneous.  The 20th century saw no reduction in the number of wars fought with conventional weaponry, and nothing suggests the 21st century will either.  And alongside the conflicts erupting across the globe, in which tens of millions of combatants and civilians have lost their lives, we face new enemies against which nuclear weapons are next to useless.  Terrorists who are prepared to martyr themselves for their cause will not be deterred by the threat of annihilation.  The possession of nuclear weapons by the major powers has not brought peace, nor has it made the world a safer place.

Worse, their proliferation increases the risk that nuclear weapons will be used by accident.  There were many near misses during the Cold War.  During the Cuban Missile Crisis, having lost communication with Moscow, the commander of a Soviet submarine, not knowing whether or not war had broken out, moved to launch a nuclear torpedo at the US ships blockading Cuban waters; launch required the consent of the three senior officers on board, and one of them refused.  There were numerous other incidents of confusion, misidentified flying objects and technical hitches which could have led to World War III.  It is perhaps only luck that prevented this.

Thousands in the UK marched in 1958 against nuclear weapons,
and the protests continued throughout the 1960s.
According to Gareth Evans, Co-Chair of the International Commission on Nuclear Non-Proliferation and Disarmament (ICNND), there are “at least 22,000 nuclear warheads still in existence, with a combined destructive capability of around 150,000 Hiroshima or Nagasaki sized bombs.” Why – morally and rationally – do we need this capability on the planet?  Experts believe that even a limited nuclear conflict – say, between India and Pakistan – would result in a nuclear winter, produced by ash from urban firestorms blocking out the sun and “causing devastating changes in weather patterns and rainfall.”

As for the role of Trident as a deterrent, the theory of Mutually Assured Destruction (commonly known as MAD) no longer stands up to scrutiny.  In 1967, US Secretary of Defense Robert McNamara, recognising that there could be no such thing as a winnable nuclear war, famously gave a speech advocating the acquisition of so many nuclear weapons that it would, quite literally, be madness to start one.  However, it is now recognised that MAD is no longer viable in an unstable world with increasing numbers of nuclear adversaries.  In an article in the Wall Street Journal in 2007, two former US Secretaries of State, George Shultz and Henry Kissinger, former Defense Secretary William Perry and Sam Nunn, a former chairman of the US Senate Armed Services Committee, argued that "it is far from certain that we can successfully replicate the old Soviet-American ‘mutually assured destruction’ with an increasing number of potential nuclear enemies world-wide without dramatically increasing the risk that nuclear weapons will be used.”

If the moral and strategic arguments against replacing Trident aren’t convincing, then the economic ones surely must be.  £3 billion has already been spent on exploring new designs for the submarines.  It is estimated that the final cost could be up to £34 billion.  Maintenance of the fleet will cost around £100 billion over the next 30 years.  For that money, we could build the 100 best schools in the world.  Or, as John Prescott argued yesterday, plug the predicted £25 billion black hole in NHS funding by 2020.  In a time of austerity, we can ill afford a weapons system which exists only as a status symbol of former glory.

Almost from the moment nuclear weapons were first tested, politicians of all stripes have committed themselves to disarmament, and yet we remain a nuclear world.  And, like children in the playground, if some countries have them, others will want them too.  Britain could set an example by becoming the first of the major nuclear powers to fully disarm.  Unilateral disarmament could, ironically, be the very chance the pro-nuclear fantasists dream of: the opportunity for the UK to become a world leader again.