Exam marking errors are just the tip of the iceberg

We know that exam grades are changed each year due to challenges but the problem of marking errors is likely much bigger than anyone realises, says Dennis Sherwood
3rd October 2024, 6:00am


With over 6 million GCSE, AS- and A-level exam scripts to be marked each summer, some marking mistakes are bound to happen.

Nonetheless, given the heightened concern around marking this year, it's worth not shrugging off re-marks as "one of those things" but instead looking at the reality of what happens each assessment cycle.

While measures of the quality of marking are not directly available, some inferences can perhaps be drawn from the statistics relating to marking errors, as published annually by Ofqual for the summer GCSE-, AS- and A-level exams in England (see the graphs below).

GCSE and A-level marking: Exam grade challenges and re-marks


As can be seen from the first column in both the GCSE and A-level graphs, about 20 per cent of all challenges result in grade changes attributable to marking errors.

The higher figure for GCSEs in 2017 is likely to be a consequence of the introduction of number grades - 9, 8, 7 and so on - for GCSE English, English literature and maths in that year, and the increase in 2022 is probably associated with the first year of exams after Covid.

Either way, the fact that one-fifth of grades change due to marking errors is notable.

Marking errors in GCSE and A-level exams

Not all marking errors result in a grade change, however, which is why, in the above graphs, the proportions of marking errors are larger than the proportions of grade changes.

Nonetheless, at GCSE a marking error is discovered and corrected in almost two-thirds of challenges (once again, with a noticeable increase in 2017). At AS and A level, the proportions are higher, reaching almost 90 per cent in 2022; the overall average across the years shown is 75 per cent.

Of course, marking errors can only be discovered if a script is challenged. But as shown in the third column of the graphs above, around 90-95 per cent of scripts are never challenged.

How many of these contain marking errors that have not been identified simply because a challenge has not been made? And how many might have led to a grade change?

No one knows, for no one has looked.




In principle, every one of those unchallenged scripts might contain a marking error, but that is most unlikely. Equally unlikely, however, is that they are all error-free. So how many marking errors are there in the 6 million scripts marked each year?

The information available is based solely on the small number of scripts that have been challenged, and so the key question is: to what extent is the sample of appealed scripts representative of the whole population?
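To make the scale of the question concrete, here is a rough back-of-envelope sketch. The 6 million scripts and the 20 per cent grade-change rate come from the figures above; the 7.5 per cent challenge rate is an assumed midpoint of the article's "around 90-95 per cent never challenged", and the extrapolation deliberately uses the naive assumption that unchallenged scripts carry grade-changing errors at the same rate as challenged ones:

```python
# Back-of-envelope extrapolation. The challenge rate is an assumed
# midpoint, not an official Ofqual statistic, and the extrapolation
# assumes (naively) that unchallenged scripts resemble challenged ones.

TOTAL_SCRIPTS = 6_000_000    # scripts marked each summer (article figure)
CHALLENGE_RATE = 0.075       # assumed midpoint of the 5-10% challenged
GRADE_CHANGE_RATE = 0.20     # ~20% of challenges yield a grade change

challenged = TOTAL_SCRIPTS * CHALLENGE_RATE
observed_changes = challenged * GRADE_CHANGE_RATE

# If the same rate held across the unchallenged majority:
unchallenged = TOTAL_SCRIPTS - challenged
hypothetical_hidden = unchallenged * GRADE_CHANGE_RATE

print(f"Challenged scripts:          {challenged:,.0f}")
print(f"Observed grade changes:      {observed_changes:,.0f}")
print(f"Hypothetical hidden changes: {hypothetical_hidden:,.0f}")
```

Under these assumptions the observed 90,000 or so grade changes would sit alongside more than a million hypothetical undetected ones. The point of the sketch is not the number itself, which depends entirely on whether the challenged sample resembles the whole, but how sensitive the answer is to that very question.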

Fees, boundaries and fears

We should be wary of inferring that the challenges that are made are representative of the whole system - for two key reasons.

Firstly, most challenges will be made by candidates or centres that can afford to fund - and possibly forfeit - the fee of a challenge.

Secondly, most challenges to scripts will be by those just below a grade boundary - which makes sense because they have the chance of a better grade.

Of course, marking errors are not affected by either of these things at source. When scripts are marked no one knows the economic status of the candidates, nor does anyone know where the grade boundaries will be.

Furthermore, the likelihood of discovering a marking error in a script just below a grade boundary is about the same as that of discovering one in a script just above a grade boundary. No one is actively looking for these, but they are there nonetheless.

As such, I fear that the incidence of marking errors is much higher than any of us might like. The evidence currently available, however, is insufficient to draw firm conclusions. Perhaps some statisticians should take a look.

Dennis Sherwood is author of Missing the Mark: why so many school exam grades are wrong, and how to get results we can trust
