What do MATs do with their exam results data?

On GCSE and A-level results days multi-academy trusts receive a huge amount of data. Ellen Peirson-Hagger asks leaders how they approach analysing it to drive improvements for the year ahead
9th August 2024, 6:00am

As GCSE and A-level results days approach, school and multi-academy trust leaders will be readying themselves for the onslaught of data they’re about to receive.

But with so much to look through, how do you make sense of what’s available and ensure that you’re making the most of it?

For Paul Tarn, CEO of Delta Academies Trust, speed is of the essence when it comes to analysing exam results data, which is why analysis at his trust begins almost the moment data is made available.

“We’ll download our data a couple of minutes past midnight on Wednesday morning, and then by the time we have our board meeting on Thursday, all of the analysis will be done,” Tarn tells Tes.

To help do this at speed, Delta - which runs 57 schools in the Yorkshire and the Humber region - employs a full-time data analyst and a part-time assistant. Their work in those first moments focuses, in particular, on how student results compare across a three-year trend.

“We look at where we were and where we are now. Are we up or down?” says Tarn.

Analysing GCSE and A-level results data

The trust uses a RAG (red, amber, green) system to compare its data to the national average and to flag up gaps in attainment between disadvantaged students and their peers in all schools.

Tarn explains: “The key measure for us is: what’s the gap for disadvantaged children, and have we managed to bridge that gap?”

The other key data that Tarn and his team focus on is their academies’ Progress 8 scores and the numbers of grade 4s and 5s in GCSE English and maths - “they’re the really important qualifications”, he says.
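
The article does not detail the tooling Delta uses, but the kind of check Tarn describes - RAG-rating headline measures against national benchmarks and tracking the gap for disadvantaged students - could look roughly like the sketch below. The column names, benchmark figures and amber thresholds are illustrative assumptions, not the trust's actual data or code.

```python
import pandas as pd

# Hypothetical per-school headline measures; real figures would come from exam results feeds.
results = pd.DataFrame({
    "school": ["Academy A", "Academy B"],
    "progress8": [0.12, -0.25],
    "basics_grade5_pct": [48.0, 39.0],       # grade 5+ in both English and maths
    "p8_disadvantaged": [-0.05, -0.60],
    "p8_other": [0.20, -0.10],
})

NATIONAL_P8 = 0.0          # illustrative benchmark
NATIONAL_BASICS_5 = 43.0   # illustrative benchmark (%)

def rag(value, benchmark, amber_band):
    """Green at or above the benchmark, amber within the band just below it, else red."""
    if value >= benchmark:
        return "green"
    if value >= benchmark - amber_band:
        return "amber"
    return "red"

results["p8_rag"] = results["progress8"].apply(lambda v: rag(v, NATIONAL_P8, 0.2))
results["basics_rag"] = results["basics_grade5_pct"].apply(lambda v: rag(v, NATIONAL_BASICS_5, 5.0))

# The measure Tarn says matters most: the gap for disadvantaged children.
results["disadvantage_gap"] = results["p8_other"] - results["p8_disadvantaged"]

print(results[["school", "p8_rag", "basics_rag", "disadvantage_gap"]])
```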

Getting this data together quickly means that the trust team has a set of clear insights to share in a meeting with the board of directors that takes place on exam results day.

Keeping things simple

The approach is similar at Cabot Learning Federation, where head of data Jason Bedingfield says he produces “analysis very quickly, so we can understand where we are in a timely manner”.

Steve Taylor, chief executive of the trust, which runs 34 academies in the South West, says that insights on disadvantaged learners, in particular, are made clear because this is a key priority for the MAT.

As at Delta, “the board wants to know the performance of disadvantaged learners first”, Taylor explains.

And these insights are not just the preserve of those at the top: the exam data produced is shared with staff across the trust.

“We deliberately keep things simple, and by doing analysis in-house, it means we have more control over what to use and how to showcase it,” Bedingfield explains.

The first data that is shared is subject-specific insight for each classroom teacher, sent in an Excel spreadsheet with “one page of information that concentrates on the class headlines: what’s the progress? What’s the attainment?”

It’s important to Bedingfield that the data is accessible, he says, so the one-page analysis is “colour-coded” and “printer-friendly”. “We don’t want them to have to log into a system and try to find their way around and work out what’s important.”
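
Cabot’s actual pipeline isn’t described in the article, but a minimal sketch of how one-page, per-class headline summaries might be generated - assuming hypothetical pupil-level columns, a made-up workbook name and pandas with openpyxl installed - could look like this:

```python
import pandas as pd

# Hypothetical pupil-level results; a real trust would pull these from its MIS or exam feeds.
pupil_results = pd.DataFrame({
    "class": ["11A/Ma1", "11A/Ma1", "11B/Ma2", "11B/Ma2"],
    "pupil": ["P1", "P2", "P3", "P4"],
    "grade": [6, 4, 5, 3],
    "target": [5, 5, 5, 4],
})

# Simple progress proxy: achieved grade relative to target grade.
pupil_results["vs_target"] = pupil_results["grade"] - pupil_results["target"]

# Class headlines: attainment (average grade, % at grade 4+) and the progress proxy.
summary = pupil_results.groupby("class").agg(
    pupils=("pupil", "count"),
    avg_grade=("grade", "mean"),
    pct_grade4_plus=("grade", lambda g: 100 * (g >= 4).mean()),
    avg_vs_target=("vs_target", "mean"),
)

# One sheet per class, so each teacher gets a single printable page.
# (Requires openpyxl; "/" is not allowed in Excel sheet names.)
with pd.ExcelWriter("class_headlines.xlsx") as writer:
    for class_name, headlines in summary.iterrows():
        sheet = class_name.replace("/", "-")
        headlines.to_frame(name=class_name).to_excel(writer, sheet_name=sheet)
```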

When the GCSE and A-level results have been processed, the trust makes the data available to all staff, says Kate Richardson, Cabot’s education director.

“We come together, face to face, the week before our schools go back,” she explains. “We share a high-level analysis of trust data and how schools have improved over time, and then that data is released to the academies” (via a shared drive).

The exam data is further used as a point of reference on Inset days later in the year, Richardson adds.

Owning the data

Making data accessible to staff is also important at Mossbourne Federation, which runs four schools in East London, says its CEO, Peter Hughes.

“For us, it’s about every person owning their data,” says Hughes, whose team includes a data manager and intelligence lead. “So every classroom teacher will analyse their data, every head of department, every line manager, every head of year.”

At Mossbourne teachers track data throughout the year using a platform on which they can see their class’s seating plan, attendance, exclusions, behaviour points and grades all together.

Hughes says: “When you go to the next level up you compare class by class, across the whole subject. Then you could look right across the school.”

Exam data is just one component of this detailed data tracking, Hughes adds, and, if the rest of the system is working as it should, results day should not be a revelation.

“I don’t want surprises on results day. I should know everything already,” he says.

Lynsey Holzer, CEO of The Active Learning Trust, which runs 19 academies in Cambridgeshire and Suffolk, agrees.

“The results give us an indication of how accurately the schools are assessing, as there should be no surprises on results day,” she tells Tes.

“Throughout the year teacher assessment data is collected by the trust and analysed centrally by our data team and passed to the education team, a central body of improvement leads who work with all of our schools.”

This doesn’t mean that exam results data isn’t important, though. “Exam data for GCSE and A-level results is crucial in our understanding of performance in different schools,” Holzer says, because it shows “how well a school is performing in all subject areas and within the trust and national benchmarks”.

As such, after results days, the trust’s “senior leaders meet and discuss the results, by subject and by groups of pupils”.

Holzer adds that while the trust analyses results carefully, it is wary of using them to set targets for future years.

“We believe that targets in the traditional sense - when based on anything other than recent and accurate assessment - can promote underachievement. They are a blunt instrument.”

Reacting to a shock on results day

But what if there are surprises on results day?

Tarn explains how results data informs adjustments for the upcoming academic year at Delta, specifically regarding teacher deployment.

“Last year we had quite a shock,” he says. “One of our schools didn’t do as well as we thought they would in maths. What we discovered was that the whole year group had been timetabled at the same time, which meant that the deputy head, a brilliant maths teacher, only taught one group instead of two.”

Tarn wasn’t going to let that happen again. “So on the Wednesday of the first week back in school, we retimetabled,” he says.

Hughes had a similar experience at Mossbourne: he recalls one year when an A-level class had a “car crash” on results day with grades well below expectations in one subject.

When Hughes investigated, he found that the exam specification had changed and the subject teachers were not teaching to the new version. And, looking at the year-round data, he realised that the class hadn’t been properly observed, so no one had picked up on the problem.

“Now we have a look at Year 11 and Year 13. We make sure someone [from the senior leadership team] has been in, has had a conversation, so we understand what’s happening inside that classroom,” he says.

But looking at data like this also enables you to spot positives - something that Taylor at Cabot calls his “bright spots”. If one teacher’s class performs better than expected in exams, he will observe that teacher’s lessons to see how they are doing it.

Taylor jokes that some teachers are put off by this approach: “Some say, ‘I see my reward for doing well is a visit from the CEO, rather than a box of chocolates!’

“But generally the teachers who have had that kind of impact are professionally curious - they want the feedback.”

After observing, Taylor presents his findings at the all-trust conferences that Cabot holds twice a year. “We learn a lot from the best work,” he says.

Hughes says that exam results also lead him to observe lessons in other trusts.

“We look at the data from the schools that beat us, and visit them to figure out what they’re doing,” he explains.

“The biggest example we’ve spotted is that schools at the top of the table generally offer fewer subjects. So we’ve tightened our curriculum.”

Context is key

However, while data is key to how trusts and their schools measure progress and plan ahead, leaders are keen to emphasise that each school’s context must be taken into account whenever data is analysed.

For example, Tarn says that when analysing data at a trust-wide level, his RAG system includes information on free school meal (FSM) eligibility, “because that contextualises things”.

Furthermore, he points out, data showing a primary school with “key stage 2 outcomes at 5 per cent above the national average” means more when you also know that the school’s FSM eligibility is “20 per cent above average”.

Context is also important to Hughes, who says it “is a way of understanding” results. You need to be “very careful” when comparing data across a trust because there is always more going on than the data suggests, he explains.

“Numbers point us in a direction,” he says. “It’s the conversations and the meetings that follow that are critical.”

Ellen Peirson-Hagger is senior writer at Tes
