Wednesday 5 December 2012

Can Defended Children Tell Right From Wrong?


‘...to defend a country you need an army. But to defend a civilisation you need schools. The single most important social institution is the place where we hand on our values to the next generation...’ Chief Rabbi Jonathan Sacks.

Arguably the most important thing schools do is teach children the difference between right and wrong. This vital lesson is taught all day, all year: through behaviour, sanctions, reward and conduct policies; through the behaviour and conduct of staff and senior pupils; and in the minutiae of what is merely ignored, what is tacitly tolerated, and what is noticed and dealt with.

The teaching of right and wrong sits on three vital pillars – actions, consequences and responsibility. When children do things which are wrong (actions), they may often need to be shown what consequences (good and bad) follow from these actions. Reward policies show how good actions lead to good consequences: hard work to achievement; kindness to good quality relationships; patience to satisfaction. But children also need to learn that bad actions lead to malign consequences – idleness to loss of opportunity; untruthfulness to a lack of trust; wilfulness to harm.

A key element of this process involves responsibility. It is in taking responsibility for a poor educational outcome that a child learns the value of hard work and practice; it is in taking responsibility for another child’s unhappiness that a child learns the malignancy of unkindness; it is in the acknowledgement of wilfulness that a child learns to stay within moral boundaries.

In other words, the key moral discovery children make in learning right-from-wrong is to mitigate their selfish desires with thought about consequences, and their responsibility for these consequences. And so, at the heart of the ‘right and wrong message’ is delayed gratification.

Increasingly, however, I am hearing from other headteachers that parents are seeking to excuse their children from the consequences of, and therefore from all responsibility for, their inappropriate behaviour. Parents seeking – lovingly, they believe – to shield their children from the results of misbehaviour are actually teaching their children that an action need not have a consequence, and that they can avoid responsibility for their actions. Once a child becomes accustomed to this, they may begin to learn the political behaviour which enables them to manipulate their parents so effectively that they can rely on being defended, whatever the circumstances.

And who suffers from this? Of course it makes life more difficult for teachers and heads. In the long run, it will also make life more difficult for parents. But in the short, medium and long run, it damages the children. Jonathan Sacks again: ‘We know – it has been measured in many experiments – that children with strong impulse control grow to be better adjusted, more dependable, achieve higher grades in school and college and have more success in their careers than others. Success depends on the ability to delay gratification...’
If you are reading this, and you are a parent, please hold your child accountable for their actions. Do not necessarily believe your child(ren) as they seek to excuse themselves from moral responsibility. Help your child(ren)’s school to do the same. Support them as they do so.

If you are working in a school, as a teacher or headteacher, be reassured. The same battle is being fought in schools all over the world. It is a battle for values, for value, for civilisation. It is worth fighting.

Monday 19 November 2012

Ofqual's 'Breach of Contract' with the Young?


So, Ofqual has decided that next term’s January modules will be the last. I am pleased to see the back of them – and getting rid of them means that all schools which previously used them, ourselves included, will gain a week of teaching time. Fewer exams, more teaching – what’s to argue with?

Well, actually, two things. First, a cohort of students embarked on their A level studies in September (did Ofqual notice? Do they know that students generally study A levels for two years?) on the expectation that – possibly even the promise that – the courses would be structured in a certain way. About one fifth of the way through, the goalposts have been subtly shifted. Should the decision perhaps have been made now, but for implementation after January 2014?

When we talk to students about the things that matter to them – going to concerts, sporting events and the like – we often say to them that they have to ask our permission to miss things before they book tickets. It doesn’t seem respectful to me when I am told that a pupil is not available for a school event because a subsequent engagement has already been booked. I tend to respond much better when I am asked before the booking.

I wonder therefore how students up and down the country have reacted to the news (have they realized yet? Do they look that far ahead?) that the ‘contract’ relating to the shape and timing of their A levels has been changed, without asking or consulting them, and after the courses had started.

And another thing… (you can always tell when someone is on their soapbox when these three words come in sequence)… has Ofqual sufficient confidence in the marking of A level modules to make university places dependent on only one go at both A2 (or all three A2) modules? Putting 50% of the marks for university in the terminal assessment does raise the bar somewhat on the requirement for accuracy. Our recent experience of the reliability of A2 assessment is not terribly encouraging. Will the requirements for accuracy placed on exam boards rise?

The UK’s public exam system is a free market. So, how about introducing a requirement for an exam board to pay a small fine every time a change in mark triggers a change in grade, and a bigger fine if the change in mark only changes the grade at the second or third stage of the appeal? Why don’t we increase the fine if the appeal has to be a priority case because a university place depends on it?

Why don’t we have a standard independent referral if the number of appeals on a certain qualification exceeds a specified percentage of entries? And why don’t we have a standard measure by which a qualification can be ‘de-licenced’ if, in the case of such an independent referral, the proportion of exams which are re-graded is significant?

Of course, if we are to ‘de-licence’ a qualification, we will have to give young people enough notice…

Saturday 3 November 2012

Remembrance - Balanced, Controversial?

If I am completely honest, there is something about Remembrance Day that I have always found difficult. If I had paid more attention to the news when I was growing up, I would have understood the stance that Robert Runcie, the then Archbishop of Canterbury, took at the service to commemorate the end of the Falklands war in 1982. His determination to pray for the dead on both sides of that conflict caused an irreparable rift between him and Lady Thatcher, who had wanted the service to be triumphal. Given that he had won the Military Cross for bravery forty years earlier, it was hard for the government of the day to criticise him.

Runcie’s underlying point – that there is loss, bravery, integrity, and heroism (and the lack of three of these) on both sides in a war – was not well understood at the time. It is possible that the way in which remembrance has been presented to schoolchildren in the years since has contributed to a one-eyed view of conflict in this country.

How many young people in the UK know anything, for example, of The White Rose? This was a group of students, who – in 1942-3 – sought to do what they could to oppose the evil of the Third Reich. The core of the group was composed of just five students, two of whom were siblings - Sophie and Hans Scholl. All were in their very early twenties.

Members of The White Rose believed that their Christian beliefs meant they had to protest against what they saw happening in their country. As a result between June 1942 and February 1943 they prepared and distributed six different leaflets, in which they called for the active opposition of the German people to the Nazi movement and its policies.

In the first leaflet they wrote: ‘It is certain that today every honest German is ashamed of his government. Who among us has any conception of the dimensions of shame that will befall us and our children when one day the veil has fallen from our eyes and the most horrible of crimes ... reach the light of day?’

Understandably, the Third Reich was extremely concerned about the potential effects of these leaflets, and the Gestapo was instructed to find their publishers.

On 18 February 1943, the Scholls took a very large quantity of leaflets to their university and dropped piles of them in the empty corridors for students to find as they poured out of lecture rooms. Before leaving, since some leaflets remained in their suitcase, they returned to the university atrium, climbed to the top floor, and from there Sophie threw the last of the leaflets into the air. A janitor saw this and called the police, and the Scholls were soon taken into Gestapo custody. Other active members were arrested shortly afterwards; in all, six members of the group were convicted and executed.

One copy of their last leaflet was smuggled out of Germany; the Allies edited it, and millions of copies were dropped from aeroplanes over Germany.

Nearly six decades later, a German national TV competition chose "the ten greatest Germans of all time". The Scholl siblings – founders and leaders of The White Rose - came fourth, ahead of Bach, Goethe, and Albert Einstein. And readers of a widely circulated German magazine voted Sophie Scholl to be "the greatest woman of the twentieth century".

Why is this heroism not widely known outside Germany? Why, in our Remembrance Day activities, do we not think about the waste of life, the courage, the acts of selflessness on both sides of the conflicts that we remember?

Perhaps if young people were taught not just history-according-to-the-victors, perhaps if they were able to understand the effects of war on both sides of a conflict, perhaps if they could hear the accounts of those whose country was hostile to theirs, they could come to a balanced and thoughtful view of conflict. The internet may be making this achievable now in a way it has not been in the past – when conflicts had to end before anybody could hear the views of the population of the country they had been fighting.

Perhaps, if remembrance in schools were to be less patriotic, and more bilateral, young people would grow up into adults who would go to war less often. As decision makers, at the polls or in the government, we might be more careful about entering armed conflict.

In our school, we will be thinking this week about loss, and sacrifice, and courage, and the cost of war – from the point of view of all participants.

Thursday 1 November 2012

(Early) Puberty Blues


Research recently referred to in the Guardian newspaper (http://www.guardian.co.uk/society/2012/oct/21/puberty-adolescence-childhood-onset) highlighted that girls are experiencing puberty eight years earlier – on average – than was the case 100 years ago. Some signs of puberty are evident in medical examinations of girls at an average age of less than 9 years, in some ethnic groups.

More is known about the speed and durability of this trend than about the reasons for it. But we do know that it is not limited to girls: further research suggests that boys are maturing physically faster than ever before, too. (See: http://www.guardian.co.uk/world/2012/oct/20/us-study-boys-puberty-earlier).

Potential causes range from diet to chemical pollution, and include consideration of social as well as biological causes. One research project, which unusually looked at the causes rather than the existence of these changes, found a correlation between family instability – particularly the absence of a father figure – and early puberty in girls (http://psycnet.apa.org/journals/dev/44/5/1409/). There is, potentially, an evolutionary explanation here: a herd of grazing animals might reasonably be expected to evolve an earlier arrival at adult capabilities by means of predators thinning out the genetic stock of those who are slow to mature – because the slow maturers would be vulnerable for longer. However, human development has not been marked by the need to survive the attentions of predators – at least in recent times!

There is clearly more work for researchers to do on the causes, but it is worth spending a few moments considering the consequences. One hundred years ago, a young woman had 12 years of cognisant evaluation, communication and assimilation, between the ages of 4 and 16, to come to terms with her place in the world before she had to consider the consequences of sexual maturity. Now girls have an average of 4-6 years, depending on their ethnic group, to do this – for some the time period has halved, for others it is now one third of the time their great-grandmothers had. No wonder childhood and teenage mental health is poor.

Consider, too, the frequent assertion that young people are sexualised at a younger age. Is this a cause of these changes, or is it a consequence: do boys and girls become sexually aware of each other at a younger age because of the changes in their bodies rather than wholly because of cultural trends, as is often asserted?

Answers to these questions are likely to take some time to be formulated, tested, and refined. In the meantime, perhaps, parents should reflect that the fact that all is not as it was ‘when we were young’ is not entirely the fault of today’s young – who have a harder existence in many respects than their parents did.

Parents, and schools, should give more attention than ever to the fact that children – for that is what they are – are going to reach sexual capability, if not sexual maturity, at an age when, culturally and legally, they are expected to be chaste. Enabling young people to navigate the years (for some a decade) between reaching an age of sexual capability and arriving at emotional maturity and legal majority is a major challenge for schools, and for parents too, and one they will have to take up if young people are to be well prepared for young adulthood in today’s world. It is hard to see how the long interval this involves for today’s young people will be manageable without parents and schools unfashionably reinstating abstinence in the sex education programme.

Thursday 18 October 2012

Exploitation and Fair Trade in Media

On Tuesday evening this week, the editor of Private Eye, Ian Hislop, presented the last of his programmes on the ‘stiff upper lip’ which British people are famous for.

He reflected, in the course of that programme, on the edition of his magazine which had caused the most outrage. After Princess Diana’s death (never has the British stiff upper lip been less in evidence among the general public), Private Eye published a front cover which pointed out the hypocrisy with which the public acted: outrage at the circumstances of the Princess’s death was expressed by those who had been willing to pay the inflated prices of newspapers and magazines containing prurient pictures of celebrities. Those inflated magazine and newspaper prices had, of course, funded the photographic frenzy that pursues celebrities around the world through long lenses.

Privacy became a hot issue again this summer, when two members of the royal family had their privacy invaded in only a few weeks – first Prince Harry found that one of the friends he had invited back to his hotel room in Las Vegas had taken a picture of him naked, and sold it to the papers. Only shortly afterwards, his sister-in-law, The Duchess of Cambridge, was photographed sunbathing topless next to a swimming pool on a substantial private estate in France, by a man who had to stand on his car roof to see over the estate wall, and use a long lens. These photographs have been published by magazines in several countries and are apparently viewable online.

Privacy has never been so prized, and so valuable – the prices of houses with long drives and high garden walls are testament to that. It’s strange that, in the UK at least, we all want to live in houses which have privacy, and we all want privacy for ourselves, and yet we seem to have more than our fair share of the newspapers and magazines which make their money out of ignoring the rights to privacy of other people. We want our own privacy, but we want to acquire the private lives of other people too.

So how do young people reconcile the behaviour of the press, and their own magazine- or newspaper-buying habits, with the desire they reasonably have for their own privacy? And does the ubiquity of the camera – embedded into every phone – mean that privacy of grief, of suffering, or of elation is no longer possible at any newsworthy event? Pictures are taken, beamed round the world, and viewed by millions. How many subjects of such pictures will rue it later? How will our pupils deal with the possibility of giving away their own privacy, as well as participating in a business which forcibly takes it away from others?

And reality television presents another problem: how do we respond - how should we respond - when others give away their privacy, often in the hope of riches they cannot obtain by other means?

In an assembly this week, I suggested that the definition of fair trade should perhaps include those media which treat people respectfully. Perhaps we should encourage pupils to be as careful about buying magazines, or indeed newspapers, which buy long-range pictures of famous people on holiday, as we are about products grown in a way that exploits people. Similarly, we should get young people to reflect on the nature of the relative poverty revealed by the determination of so many to make themselves rich via reality TV.

There isn't, after all, any difference in the wrongness of exploitation – whether the person exploited is a prince or princess, or someone at the other end of the range of privilege. In fact, it’s just possible that the worst exploitation of all is the one we are blind to. Whether that applies more to materially wealthy celebrities in magazines or to 'wannabes' involved in musical talent shows is open to debate.

One thing is sure: if we don't teach young people what today's exploitation looks like, they won't know how to avoid it – as victims, consumers, or managers.

Thursday 11 October 2012

Lessons from Jimmy Savile

Last week it was revealed, to widespread surprise and disappointment, that the late Sir Jimmy Savile (DJ, TV presenter, celebrity and charity fundraiser) was the subject of multiple posthumous allegations of child abuse. However, the most worrying revelation of all was surely the report, by Hugo Rifkind in The Times, that at least one such criminal offence had been detailed in Savile’s autobiography, published in 1974. (See http://is.gd/7JUhgT).

Rifkind’s conclusion, published without great commentary, is that Savile’s abuse of young women was not a secret, but it still went unremarked. He concludes: “I’m not sure what the lesson of all this is, but if there is one, it’s horribly bleak.” I suggest that there are, in fact, two conclusions – and one of them is indeed very bleak.

The first, and more cheerful, implication of the Savile saga is that we are a great deal more sensitive to issues surrounding the violation of children’s rights by adults than we were. In the 38 years since the publication of his book, a great deal has been achieved which has helped to protect children born since the 70s. Much progress has been made as a result of the persistence and enormous contribution of those who have lobbied government and established charities to support the victims of such abuse. Esther Rantzen is among the best known of those who have changed the common culture, so that what may have been unremarkable in 1974 would now be very remarkable indeed.

This does not mean, of course, that children are completely safe from violation by adults, not even in the UK. But it does represent a change in circumstances – the loss of a blindness – which is something to be celebrated.

There is, however, a second consequence. And it is bleak. One of the lessons here is that every generation has its blindness. That blindness is shockingly evident to succeeding generations. Sometimes it is risible. The fact that sensible people once thought that covering table legs was a way of preventing sexual passions from being inflamed is one example of successive generations being able to see the absurdity, and sometimes the hypocrisy, that was hidden at the time.

And so recent news raises this vital question: what is it that adults are currently doing to children which in some way violates their rights, and is damaging to them – and to which we are blind? About which elements of our culture will commentators express incredulity that we failed to discern the damage to young lives to which we contributed, or which we at least failed to prevent?

Sitting in a provincial airport at the end of a conference last week, surrounded by fellow headteachers, I asked this question. We struggled to think of anything which had the same potential for damage to young lives, but the consensus might surprise some readers. One head expressed it best when she suggested that the excessive supervision and control of young people by parents – denying children independence and a normal scope of decision-making – might be seen as risible by our children or grandchildren. It seems to me to be a very sensible starting point, but there may be other ideas which are equally worthy of consideration.

Each answer may not, of course, be as important as continuing to ask the question.

Tuesday 2 October 2012

Exam Problems - Known For 100 Years.

Recent events have made many in the UK ask profound questions about the nature of our examining system. At the same time, I was asked to provide clearer references for a book to which I have contributed a chapter, and looked up some material on Max Weber. I had read this about a year ago, refreshing my university memory, but that memory had slid away once more. My re-reading made me realise that the structure and nature of Britain's exams give rise to the risk of poor outcomes, and that we have known this since Weber was writing, about 100 years ago.

John Keane's book Public Life and Late Capitalism refers (in chapter 2) to four elements of Weber's theory: first, bureaucracy is a structure in which actions are precisely executed by an objective matrix of power, which emanates from the top. Trust in decisions made lower down is completely absent; all activity is structured. Second, tasks are rigidly divided, regulated (in writing), and compartmentalised. Third, activity is dehumanised, and there is no regard for particular people or individuals. Fourth, such calculating, mechanised organisation ensures its own 'relentless advance'.

Even as I recoiled in horror from this concept of 'efficiency', the vision it conjured up was entirely redolent of my own experience of marking public exams – an experience which, I am happy to say, is more than a decade out of date. I do, however, still have dealings with exam boards, and I continue to recognise the picture: the post-industrial-revolution factory organisation brought to examining, which reduces participants to cogs in a giant machine, recognises no one as an individual, and ploughs on through complex situations with relentless and destructive simplicity.

Furthermore, the criticisms recently advanced by HMC and ASCL (associations representing UK headteachers), and by others, following the GCSE debacle seem to point towards the exam boards behaving in a Weberian manner.

The move to point-by-point mark schemes, which compel all students to adopt the same methodology and logical pace as the examiner (and which heavily penalise the brilliant but impatient pupil who does not want to define terms as they go along), will be familiar to many. The rigidity of the process, the purity of which trumps the justice due to any injured party in an appeal, is recognisable to headteachers. But what can we do about it?

I suggest that, if we are to learn the lessons of comparing our exam system to Weberian bureaucracy, four things need to happen:
1. Exam boards need to be liberated to set exams which do not necessarily produce answers so mechanical that 'tick box' marking can assess them. Such liberation will also allow able candidates to be stretched, especially at A level.
2. Examiners must be paid enough to recruit appropriately expert examiners, who can be trusted to give marks on the basis of their appreciation of the whole of a candidate's answer.
3. Young people should take fewer exams. The unwieldy, ravenously data-hungry bureaucratic exam-board machine will always say that it needs a little more data for exams to be really reliable, but the truth is that the country is drowning under exam scripts for at least two months of every year.
4. Exam boards cannot be profit-making companies. If they are, they will continue to seek out the most efficient mechanisms for attributing marks to exams - not the most accurate, nor the most meaningful. Perhaps exam boards should be nationalised, under a supervisory board of teachers, universities and employers' associations.

Would these four things make exams error-free? No, but they might make the system a bit less monolithic, a bit less unnavigable, and a bit more accountable.

Wednesday 26 September 2012

Unhelpful Pressures... (2) Education Matters


What’s the matter with our education?

Materialism can be defined as the view that what you can see is all there is, and is all that is important. The physical, actual, tangible, visible world is always viewed as more important than the intangible, the philosophical, the thoughtful, the contemplative, the spiritual. What is outside our head is apparently more important than what is inside it. As such, the opposites of materialism are idealism and spiritualism.

The effect of materialism is that, in recent times, human development has been characterised by our being better at talking, thinking, and being imaginative about what is material than about what is not. My generation has produced the mobile phone, the iPad, the plasma screen TV and the digital camera, all of which are useful, and nice both to look at and to use. But in conversation with me, a professor of engineering at Bath University recently contended that there have been no really important scientific discoveries since the invention of the silicon chip nearly 50 years ago.

Perhaps we should compare the development of the last 50 years with the extraordinary leap forward in art in renaissance Italy, or with the leap forward in music that took place in the lifetimes of Bach or Beethoven. I rather doubt there is anyone living who is going to make such an impact on music as either of those composers – certainly not Simon Cowell! In two hundred years, will lecturers in History of Art devote whole lecture courses to the developments in Art over the last fifty years?

Materialism makes human beings into animals that look with their eyes and listen with their ears, rather than doing both or either with their imagination. Materialism is – I believe – making human beings fundamentally less creative. It is also, rather obviously, significantly related to the decline in the importance of religion in Western Europe, although not in the rest of the world, particularly the US, South America, Africa and the Far East.

I see three dangers from the preoccupation with all that is physically substantial: first, that our culture will become less sophisticated – in music, in visual arts, and in performance art too. I think that there will be less, in a generation or two, therefore, to make our spirits soar, less to inspire people. Second, materialism will continue to make our relationships shallower. Visitors to less materially affluent societies often remark on the better quality of relationships, and of time, which such societies enjoy. Our western society sees its best expression of this at Christmas, when so many families give children presents which ensure they will play with them on their own – giving their parents time to ignore their children: the material replacing the ideal, or the relational.

Thirdly, materialism will continue to make us narrowly evidential. Is it a step too far to suggest that it is partly the rise of the material that has made our exams increasingly mechanical? Since everything has been empiricised, and Andersen Consulting's maxim 'What gets measured gets managed' has been swallowed by government, today's exams demand that marks must be narrowly and provably accurate. Flights of philosophy are no longer valued. The efficient engineering of the manufacturing process must be brought to bear on the examining ‘industry’, like a modern day industrial revolution. The consequences are topical, and need no elaboration.

But materialism has had some positive consequences, too. Would we be as concerned for our planet if we hadn’t learned to be a bit more material in our considerations than our forebears? Would we be as scientifically active if we weren’t so curious about the physical properties of space, and of atoms – even neutrinos?

Materialism isn’t all bad. I think it is mostly so, but not entirely. Perhaps our challenge is to spend a little time being immaterial, in a way which makes our spirits soar. This may be by looking at a piece of art, or listening to a piece of music, or spending some time contemplating the nature or existence of God. It's interesting that none of these, of course, will qualify students for the English Baccalaureate.

Wednesday 19 September 2012

Unhelpful Pressures on Education (1) - Consumerism

I buy therefore I am. Or even I consume therefore I am. As the car bumper sticker puts it, ‘He who has the most toys wins.’

At a time when shopping seems to have become our national sport and obsession – the investment in new shopping centres is even greater than the investment in new football stadiums, and possibly greater even than our transitory investment in the Olympic movement – and at a time when our country has got itself into difficulty because the size of our income has been no constraint on the amount we have sought to buy, consumerism seems to have become a universal way of thinking. Consumerism can scoop up even the most recalcitrant and influence us insidiously. A generation which spends money it doesn’t have, to buy things it doesn’t want, to impress people it doesn’t like, cannot pretend that education has somehow escaped the consumerist plague.

The consumerism epidemic has three principal symptoms, even for those who resist it fiercely:
First, consumerism makes the most important question ‘Do I like it?’, not ‘Is it beneficial?’ Enjoyment has become the most important, or certainly the dominant, consideration in our lives. The appraisal of lessons has become much more influenced by the question floating in the senior manager’s head (or the inspector’s): ‘Are the pupils enjoying themselves?’ If education is training of the mind, why do we expect the hard yards of learning calculus to be made fun, in a way that an Olympic athlete’s first training session of the day, at 6 am, outside, in November, never could be?

And so our world is high on entertainment, low on challenge, high on instant results, low on deferred gratification. Try reading that sentence again and inserting ‘our schools’ for ‘our world’, or even ‘DfE initiatives’. Even churches seek to entertain their congregations before trying to teach them. And we are all doing this because of consumerism, and we don’t even realise, day to day, that it is happening.

Secondly, consumerism teaches us that if it isn’t right, we should just throw it away and get a new one. A few years ago someone said to me that they didn’t want to buy a television which lasted for more than 3 years, because they found the whole experience of going out and buying a new television so exciting. The trouble with this is that pupils come to think that if they are finding learning hard, they should just get a new teacher; managers think that if their workers aren’t delivering, they should just fire them and hire new ones, rather than training or helping them; and husbands and wives often think that if their marriage isn’t working, they should just get rid of their husband or wife and get a new one. Treating all things as disposable cons us into thinking that people are too. And they aren’t. A message of education should be that we can change people; otherwise, why bother teaching them? Isn’t teaching people changing them?

Thirdly, for a person in a consumerist society, their identity comes from their brands. People are defined by what they buy. They are a Jack Wills person, or Quiksilver, or Burberry. They are Apple, or Samsung, Mercedes or BMW, Bose or Beats. People long to get closer to being the very epitome of their favourite brand.

There is no real belonging, because the brands turn over so fast that no one can ever rest – and, of course, this ensures that clothes can be thrown away long before they are worn out, which is the purpose of fashion. Belonging to a brand is no help in a crisis. Not like belonging to a village football club, or a Rotary club, or a sailing club, or a book group.

Last summer my stepfather died, and my mother found that people from all the clubs and associations in their small seaside town were kind and helpful to her. Some from the allotment association, some from the Rotarians, and some from the sailing club. When that kind of thing happens to our generation, no one will call round from H&M, or Apple, or Mercedes-Benz. Where will our belonging be? If we are defined only by our brands, where will our roots be in a crisis?

Even schools have become brands – as they try to create a sense of belonging in the young which can keep up with multinationals advised by the best branding agencies in the world. Unsurprisingly, the advertising agencies, and the multinationals they advise, are winning. And a generation of young people are lovin’ it. It’s a race schools can’t win.

Consumerism poses educators serious problems, and some good questions. We need to address them head on.

Saturday 15 September 2012

Mindset versus Selection


A quick survey of the websites of famous schools elicits a clear message that a certain level of ability is required for pupils to enter the school. This suggests that the ability to do well at school is, in some way, fixed by the time pupils reach that entry age; otherwise, why require it? Anecdotes of pupils being assessed for entry to nurseries in city centre schools suggest that this ability is regarded by some institutions as being fixed at an early age.

A review of the research and work of Professor Carol Dweck, now widely accepted as well-founded, could cause the impartial observer some puzzlement when considering the entry requirements of independent schools. Dweck suggested that people have either a fixed mindset or a growth mindset. The growth mindset – found to be correlated with long-term success and resilience – is not correlated closely with academic achievement in the very young. Does academic selection, therefore, imply a fixed mindset on the part of the school – and therefore a likelihood of the school inculcating a fixed mindset in its pupils? (A fuller explanation may be found at: http://michaelgr.com/2007/04/15/fixed-mindset-vs-growth-mindset-which-one-are-you/).

Whilst it is surprising how little traction Dweck's theories have achieved in British independent schools (how many – or should that be how few – have created structures in which there is much greater reward for effort than for achievement?), this doesn't mean that selection itself is wrong. The function of selection, however, in a school which has 'dwecked' its curriculum is to make sure that each pupil starts at a level where the school’s curriculum will enable that child to experience at least some success. Although effort generates more progress over time than pure ability, both Dweck and Malcolm Gladwell (in ‘Outliers’) would concede that a child who starts by experiencing challenges that are always too difficult, and experiences no success at all, is likely to give up before Dweck’s benefits are achieved.

Nevertheless it is worth schools, and indeed parents, asking themselves whether the normal ways in which the school rewards pupils, and communicates with them from day to day, stand up to the scrutiny which Professor Dweck might bring to bear, in particular:
  • Do teachers use language which suggests achievement is the result of innate qualities or of hard work? (Contrast ‘Well done, you have to be very clever to manage that’ with ‘Well done: it takes real perseverance to manage that’.)
  • Do pupils regularly hear the message that taking on challenges which are too difficult is a good thing, or is failure always failure?
  • Does the school give prizes to those who try hard, or those whose results put them ahead of their peers?
  • Do pupils take part in extra-curricular activities to learn skills, or to win, or pick up ‘baubles’ (including Duke of Edinburgh Awards)?
  • Are the school’s teachers committed to learning – do they have a growth mindset?

Tuesday 11 September 2012

The missing ‘ter’ – a danger to good education.


I am always interested by those independent school headteachers who have the ear of the media. It is good to understand what is felt to be of great importance at other schools. One thing which has fallen into this category in recent years has been the huge importance attached to ‘independent learning’. It’s a new and fashionable must-have.

The more I consider this, the more I believe it is, in part at least, a logical consequence of an insidious individualism in our culture. What I mean by that is that people are both viewed as individuals and valued as individuals. ‘We’ and ‘us’ become less important words than ‘I’ and ‘me’. One of the best examples of this is Apple, who have caught the mood of our times – the zeitgeist – more accurately than anyone else. That’s why it’s called the iPad, the iPod, the iPhone and the iMac. Even Apple’s cloud is called me.com.

It is an individualist view of the world – which emphasises the rights and importance of each individual to make their own decisions, for their own good only – which has helped to fuel the rise in the continual chatter about independent learning in schools. So parents nod sagely when headteachers talk about cultivating independent learning in their pupils. But is it a trend we should be worried about?

Like most dangerous untruths, this one is, I suggest, only slightly misguided, but it is misguided in important ways. Individualism has real problems: it makes us selfish; it makes us poor team players; it makes us unable to understand or value collective organisations. Note how, as the world has become more individualist, organisations which help people work together have struggled. That’s why organisations as diverse as the Catholic Church, our local Amateur Dramatic Society and the Rotary Club are experiencing declining participation. We are, more and more, ‘bowling alone’, as Robert Putnam put it.

Of course young people should learn how to be independent, and how, where and when to use their independence. Interestingly, they use it most obviously in school in their exams. At a time when, if it isn’t examined, it isn’t valued, it’s hardly surprising that independence – learning to cope on your own in silence – is highly prized.

But our young people need also to learn where and when and how they should be, could be, might be dependent. Perhaps professional football would be better if players were more dependent on the referee. Independence from the referee doesn’t make sport work better.

And, more importantly still, where in their lives should the young exercise interdependence? So many of the activities excellent schools offer actually cultivate interdependent learning more than they do independent learning – team sport, ensemble music, drama, Duke of Edinburgh Award expeditions, Combined Cadet Forces, debating, Model United Nations, and so on.

The quality of our pupils’ – students’ – independent learning, and independent living, may gain them their first job, but it is surely the quality of their interdependent learning which will gain them their first promotion, and the most important promotions too. What a shame that only one A level, and one which is not highly valued by universities, should seek to assess young people’s contribution to discussion, and their ability to learn in an interdependent way.

Inserting the three letters ‘ter’ may seem like a small change to make, but I suspect that if more schools focussed more on interdependent learning than they do on independent learning, the results might be more confident, more diverse, more engaging adults. Higher EQ, to go alongside high IQ.