Revealed: How data cut teacher ‘leeway’ on GCSE grading
The “diversity” of the ways in which schools arrived at centre-assessed GCSE and A-level grades last year has been revealed by new Ofqual research, which shows that many relied on data-driven processes.
The research, based on responses from 54 teaching staff, was undertaken prior to the government’s U-turn over centre-assessed grades (CAGs), when schools still anticipated some statistical moderation of their grades through Ofqual’s algorithm.
Ofqual found that among most of those interviewed, “use of data was very important [in awarding CAGs], with a lot of descriptions of data files such as spreadsheets containing all of the available data compiled for each student”.
Sometimes this data came from senior management or a dedicated data analysis team in the centre; in other cases it was compiled within departments, or by individual teachers or tutors.
“Sometimes these data files contained some more qualitative information in addition to test and work marks to help with decision-making,” says the report.
But the majority of interviewees described a “heavily designed process, coming from the senior centre leadership, and disseminated through departments” in order to “impose standardisation”.
Standardisation was largely imposed through either the sharing of centrally compiled data or through the sharing of detailed centre-devised guidance and plans, Ofqual found.
The shared data was “organised in such a way as to lead the departments towards the calculation of grades in a similar way”, while the shared guidance “detailed what departments were expected to do, but gave them a little more leeway over the choice and weighting of evidence to use in their decision-making”.
Avoiding ‘ridiculous’ grades
Schools took a range of approaches, with one school reducing the number of students awarded each grade by 15 per cent as a “starting point” for heads of departments, to avoid “ridiculous” results.
“We looked at two or three years of past data, we took averages. We looked at what kind of progress do pupils tend to make between mock exams, how reliable are predicted results, and so on, and we came up with a set of data,” one deputy head at an independent school said.
“So we then came up with a set of grades that we thought each department should be likely to get percentage-wise, given the cohorts that they’ve got...we then reduced those numbers. So we took down by 15 per cent the number of pupils achieving each of those grades. So if there had been 100 pupils getting an A at A-level maths, we put in 85 to get an A at A-level maths.
“And then we gave those numbers to the heads of department and said, ‘Here’s your starting point - use these numbers and we wouldn’t expect it to be worse than this but use this as your starting point.’ Because we knew that they would have lots of cases where they’d want to be generous, they’d want to err on the side of positivity, and that then gave us the flexibility to build back in so that we’d come up with a set of results that weren’t ridiculous, effectively.”
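The arithmetic the deputy head describes can be sketched as follows. This is a hypothetical illustration of the "reduce by 15 per cent" starting point, not the school's or Ofqual's actual tool; the function name, subject figures and grade counts are all invented.

```python
# Sketch of the "starting point" approach one school described: average the
# last few years' grade counts per subject, then cut each count by 15 per
# cent to give heads of department a deliberately cautious baseline.
# All figures below are invented for illustration.

def starting_point(past_year_counts, reduction=0.15):
    """Average grade counts over past years, then reduce each by `reduction`."""
    n_years = len(past_year_counts)
    averaged = {}
    for year in past_year_counts:
        for grade, count in year.items():
            averaged[grade] = averaged.get(grade, 0) + count / n_years
    # Reduce each averaged count by 15 per cent, e.g. 100 A grades -> 85.
    return {grade: round(avg * (1 - reduction)) for grade, avg in averaged.items()}

# Three invented past years of A-level maths grade counts.
history = [
    {"A*": 20, "A": 100, "B": 60},
    {"A*": 24, "A": 100, "B": 56},
    {"A*": 22, "A": 100, "B": 58},
]
print(starting_point(history))  # {'A*': 19, 'A': 85, 'B': 49}
```

As in the quote, a subject that historically awarded 100 A grades would be handed a starting point of 85, leaving departments room to "build back in" the generous cases.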
Some schools also calculated what a “reasonable” contextual value-added judgement would be for 2020 as a guide.
Another teacher reported that their school had used an average of the past three years’ grades in each subject to create a “distribution curve” as a guide for their 2020 results.
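The "distribution curve" guide might be computed along these lines: average the percentage share of each grade over the past three years to get target proportions for 2020. A minimal sketch, with all cohort sizes and grade counts invented for illustration:

```python
# Sketch of a three-year "distribution curve": average each grade's
# percentage share of the cohort across past years. Figures are invented.
from collections import Counter

def grade_distribution(past_years):
    """Average per-grade percentage shares across past years of results."""
    totals = Counter()
    for grades in past_years:          # one list of awarded grades per year
        counts = Counter(grades)
        year_size = len(grades)
        for grade, n in counts.items():
            totals[grade] += 100 * n / year_size
    return {g: round(v / len(past_years), 1) for g, v in totals.items()}

# Three invented years of GCSE results (50 students per year).
history = [
    ["9"] * 5 + ["8"] * 10 + ["7"] * 15 + ["6"] * 20,
    ["9"] * 6 + ["8"] * 9 + ["7"] * 16 + ["6"] * 19,
    ["9"] * 4 + ["8"] * 11 + ["7"] * 14 + ["6"] * 21,
]
print(grade_distribution(history))  # {'9': 10.0, '8': 20.0, '7': 30.0, '6': 40.0}
```

The resulting percentages would then serve as a rough guide for how many of each grade to award in 2020, in the way the teacher describes.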
Ofqual added that “a more frequent approach was for more senior management to take the initial sets of CAGs and rank orders from departments and use previous years’ performance data to adjust them”.
Teachers did ‘best job’ at grading in the circumstances
The interviewees reflected that “the hardest part was that the whole process was never going to be able to produce grades that perfectly matched performance in exams due to the predictive nature of the task”.
And they raised the point that when predicting grades, the “randomness” of future exam performance could not be entirely factored in.
Teachers also reflected on the difficulty of the process they faced.
One deputy head said: “A global disaster happened and I think we’ve come up with the best circumstances that we could in the situation. How much fallout we’re going to get, that’s the thing, that’s our abiding anxiety: we feel like we’ve done the best job we could, but we don’t know how much fallout there’s going to be from it.”