SQA results: More A grades, but we don’t know why

13th August 2021, 12:00am

Before results day on Tuesday, a headteacher privately observed to me that - despite the millions of pounds being invested in education recovery - schools were likely to post a lot of “best-evers” in attainment, based on the 2020-21 results.

Schools “root for their kids”, he said, but in some parts of Scotland, the system that replaced the exams had been implemented in a way that had led to “real unfairness” with “very vague and changing advice” interpreted differently in different schools.

Another headteacher writing for Tes Scotland online gave some examples: students had different numbers of opportunities to be awarded a particular grade - some schools had two assessment windows, some three and others more, he said. Meanwhile, in some schools, resits of a particular paper were allowed; in others, they weren’t.

Then there was, of course, the issue of question papers being shared online by students on social media platforms such as TikTok, as Tes Scotland revealed in May.

Ultimately, however, this year there was no “best-ever” to record in the national A-C pass rate, which was down on last year for National 5, Higher and Advanced Higher, although still up considerably on the last set of pre-pandemic results, from 2019.

But there was still a striking “best-ever”, with the proportion of A-grade passes hitting the highest level ever across those same three qualifications.

At Advanced Higher, more than half of entries received A grades this year - 51 per cent, up from 46.3 per cent in 2020 and 32 per cent in 2019. At Higher, the proportion of A grades went from 28.3 per cent in 2019 to 40 per cent in 2020 and 47.6 per cent this year.

The SQA said at its media briefing on Tuesday that the rise in A grades had been driven, to a large extent, by attainment in English and maths, given the high number of entries these subjects have.

The proportion of students receiving an A for Higher English this year increased by more than 10 percentage points on last year, hitting 42.2 per cent. In 2019, fewer than a quarter of students sitting Higher English received an A (23 per cent).

This prompted questions about whether the A grade now has “a credibility problem”. SQA chief executive Fiona Robertson insisted that students should have confidence in their results, and reeled off a few possible explanations for the rise: the modifications to assessment this year; the absence of external assessment; and flexibility in terms of how, and when, courses were assessed by teachers and lecturers.

But the truth is - just as the huge rises in the pass rate last year were never investigated beyond the initial “rapid review” carried out by Professor Mark Priestley (which, incidentally, recommended further research) - there appear to be no plans to dig deeper and understand why pupils do so much better when grading is put in the hands of schools and teachers.

Last December, assessment expert Professor Louise Hayward told Tes Scotland that there should have been a research project following up what happened in 2020 to investigate why roughly a quarter of teacher judgements were not in line with SQA expectations - its now notorious algorithm, of course, “adjusted” 26 per cent of teacher judgements, until the Scottish government bowed to pressure and scrapped it.

Professor Hayward said: “We don’t know and we won’t know, and we will continue to guess unless we go in and investigate.”

As we look ahead to the replacement of the SQA and a refresh of national qualifications, this lack of willingness to learn from the experience of assessing pupils in 2020 and 2021 is baffling.

It seems likely that the Organisation for Economic Co-operation and Development (OECD) report due out at the end of this month, looking specifically at assessment and qualifications, will recommend a move away from “traditional” exams - the OECD review published in June already said as much.

Yet, in the two years that Scottish education has been forced to assess students differently, we still don’t know why the “best-evers” happened - and, among those who head up our system of assessment, there appears to be precious little appetite to find out.

Emma Seith is a reporter at Tes Scotland

This article originally appeared in the 13 August 2021 issue under the headline “Students hit their A game - but we’re none the wiser as to why”
