As a nation we have just recently completed the yearly cycle of analysing our exam results for the year. School leaders, heads of department and data managers have been frantically aggregating, sorting, filtering, averaging, rounding-up and ‘residualising’ on Excel (or other vastly overpriced pieces of ‘analytical’ software). Once that’s done, the good, the bad and the ugly of the results have to be accounted for and explained. At this point things get interesting/stressful/obsessive/tiresome or downright irrational depending on your point of view.
Since reading Levitt and Dubner’s entertaining book Freakonomics, I’ve been suspicious of simplistic interpretations of the cause and effect of ‘data’. People quite naturally look for simple explanations and solutions to most situations, but it’s dangerous to make knee-jerk strategic decisions based on these simplistic interpretations.
I’ve often joked that I will ask my students to wear different coloured hats in my class each year—and then the year the results are to my liking, boldly claim ‘I made them wear lilac hats this year and look how good the results are! This should be a whole-school policy…’ In the last couple of years, I’ve heard of colleagues being told to email the school down the road to find out what they did to get such good English results, only for the roles to be reversed the year after…with no strategic changes occurring in either school!
For the last couple of years, I’ve taken an interest in cognitive biases. These are psychological effects that limit our ability to make rational judgements, and they appear to crop up regularly during decision-making situations. I decided to take a look at some of these biases in the context of school results analysis:
Anchoring effect – The tendency to rely too heavily on the first piece of information acquired on a subject when making decisions:
‘The maths department got the best results this year, so we’d better ask them how we should teach our French lessons more effectively.’
Bandwagon effect – The tendency to do things because many other people are doing them:
‘The other schools in the area are doing intervention sessions every day. We should probably do the same.’
‘Let’s sign up to PiXL.’
Clustering illusion – The tendency to overestimate the importance of streaks or clusters in large samples of random data:
‘Looking at our cohort of 85 A-level Biologists, the only 2 A* grades were from Miss Smith’s class. We should ask her how to get more A*s next year.’
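To see why such a streak is weak evidence, it helps to ask how likely it would be by chance alone. A minimal sketch (the figures beyond the cohort of 85 and the 2 A* grades are my own assumptions: four classes of roughly equal size, and A* grades assigned at random) simulates how often both A*s land in the same class purely by luck:

```python
import random

def same_class_probability(cohort=85, classes=4, top_grades=2, trials=100_000):
    """Estimate how often all the top grades fall in one class by chance alone."""
    # Assign each student a class label, round-robin, so class sizes are near-equal.
    labels = [i % classes for i in range(cohort)]
    hits = 0
    for _ in range(trials):
        # Pick the A* students at random from the whole cohort.
        sample = random.sample(labels, top_grades)
        if len(set(sample)) == 1:  # all top grades in the same class
            hits += 1
    return hits / trials

print(round(same_class_probability(), 2))
```

Under these assumptions the estimate comes out at roughly 0.24, i.e. about a one-in-four chance that both A*s sit in the same teacher's class even if teaching quality played no part at all. Hardly grounds for a whole-school strategy.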
Confirmation bias – The tendency to search for, interpret, and focus on information that confirms your previously existing beliefs:
‘The revision sessions I ran at Easter definitely improved the results. Look at Billy, Jenny and Alisha’s grades’ (while ignoring the results of the other students who turned up, and the fact that those three students would, in all likelihood, have done well anyway).
Continued influence effect – The tendency to believe previously learned misinformation (falsehoods) even after it has been shown to be incorrect:
‘We need to collect data regularly throughout the year’ despite there being no evidence that it has any impact on outcomes and no expectation from external bodies.
Focussing effect – The tendency to place too much importance on one aspect of an event, and ignore wider information:
‘The timetable/teachers/curriculum/pastoral system/behaviour policy/books are to blame for the poor results this year.’
Framing effect – Drawing different conclusions from the same information, depending on how that information is presented:
‘This APS of 6.2 is excellent compared to last year.’ ‘Compared to other schools in the area, this APS of 6.2 is very disappointing.’
Illusory correlation – Inaccurately observing a relationship between two unrelated events:
‘Introducing mandatory teacher yoga sessions has really improved results this year.’
Illusory truth effect – A tendency to believe that something is true if it is easy to understand (or if it has been stated several times) regardless of its accuracy:
‘It is important to include VAK activities in lessons so all learners can make good progress.’
Law of instrument – An over-reliance on a familiar tool or method, ignoring alternative approaches:
Abraham Maslow said it best: ‘If all you have is a hammer, everything looks like a nail.’
Choose your favourite ‘hammer’: Data drops, book ‘scrutinies’, observations?
Mere exposure effect – The tendency to like things only because you are familiar with them:
‘Highlighting notes is a great way to learn things before an exam.’
‘PowerPoints are definitely the best way to communicate information to students.’
Parkinson’s law of triviality – The tendency to give disproportionate attention to trivial issues, while ignoring bigger issues:
I was going to give an example here, but my glue stick lids keep going missing!
What is worse, even when you’re aware of these cognitive biases, you’re hit with:
Bias blind spot – The tendency to see oneself as less biased than other people.
So, given schools are massively complex places and humans are prone to making all kinds of poor and illogical judgements, what should we do after looking at the results?
I would argue that avoiding knee-jerk changes is a start. Successful organisations implement carefully considered systems and stick with them. There are evidence-informed methods that help young people learn, and these haven’t changed much in decades. Tweaking these methods to ensure they are implemented in an appropriate way for a given school context makes sense (perhaps incorporating technology to speed up these methods and also reduce teacher workload). Reviewing the curriculum from time to time to ensure it is relevant and carefully sequenced also seems valid. Ensuring the behaviour policies in the school allow teachers and students to feel safe and concentrate in lessons is an ongoing process.
However, it is unlikely this would involve a complete overhaul from one year to the next—and certainly not as a consequence of a single set of results. In conclusion, when analysing your results from the summer, beware your own (and your line-manager’s) cognitive biases when deciding on how to approach this year.
Dan Boorman is the Research Lead at Highworth Grammar School.