July has rolled around, which means that it’s almost the summer holidays - but not before primary schools have leapt that final hurdle of key stage 2 results. Well, that and sports days, cycling proficiency, end-of-year shows, open evenings, reports and parties. But, strangely enough, nobody’s job ever seems to be on the line after any of those. I suppose it would have to be a pretty awful performance of The Lion King for governors to decide that it might be time to look for a new headteacher.
Sats results, on the other hand, can cause great upheaval or upset if things don’t quite go to plan. Whether we like it or not - and regardless of ministers’ protestations about how we should all be downplaying Sats - rather a lot hangs on those few numbers. Seven years of schooling summed up in a few coloured boxes.
Variations in Sats results
Part of the problem is that we treat them like they’re the truth. After all, a test is a test, and who can argue with the outcomes? Except, in the past couple of years, even those in great authority have started to question that.
Now, if you look up the performance data for a junior school, you’ll find it comes with a government-approved caveat:
“We know from national data that pupils at junior schools, on average, have higher attainment scores at the end of key stage 2 than pupils at all other primary schools. However, on average, they also have lower progress scores. This may be for a variety of reasons…”
But once you admit to this vague “variety of reasons” for junior schools, where do things stop? Surely a variety of reasons explain variations in all sorts of contexts?
Misleading data
Middle schools also have a little caveat on their pages noting that middle schools, on average, have lower progress scores than primary schools when it comes to key stage 2 tests. But the National Middle Schools Forum also published data this year showing that during their time in middle schools, including their key stage 3 years, pupils make greater progress than in traditional primary and secondary schools. It rather seems that there is a “variety of reasons” why the data doesn’t tell the whole story.
If performance tables need caveats for these groups, then why stop there? Given Jo Saxton’s article last week about coastal schools, perhaps we can ask for an additional note to be added for schools within a mile or two of the sea?
“We know from national data that coastal schools, on average, have lower Progress 8 scores than their more affluent neighbours inland. This may be for a variety of reasons…”
Adding value
Things can cut both ways, of course. Last year we saw yet more evidence that, despite their various claims, grammar schools add no more value than non-selective schools when it comes to the most able pupils. I therefore propose that we add a new note to their performance data:
“Despite government rhetoric, grammar schools, on average, achieve higher Progress 8 scores solely based on their intake. This may be for a variety of reasons and so should not be taken as a reflection of the quality of education in the school.”
It’s amazing how many reasons there might be for data to be taken with a pinch of salt. Perhaps the best solution would be to simply add a caveat to all published results. Might I suggest:
“All published data about schools is imprecise and is likely to be a fairly unreliable indicator of the quality of provision in any given school. This may be for a variety of reasons…”
Michael Tidd is headteacher at Medmerry Primary School in West Sussex. He tweets as @MichaelT1979