GCSE mocks: have you made the most of the data?
If you teach in an 11-16 or 11-18 school (or a 16-18 FE college), then in all likelihood, not long before Christmas - or soon after we came back from Christmas (where did that time go!) - your exam groups will have sat a mock exam.
You will now have marked (or be marking) the papers, possibly entered the data into a question-by-question analysis sheet, and calculated grades.
So now what? Is that it? Is the usefulness for those mock exams over? Or is there more we can learn from them or do with them?
In my view, we can do a lot more. Here’s what we do in our maths department:
1. Target weaker areas
In my department, we complete a question-by-question (QbyQ) analysis of the mock papers. This is not as time-consuming as it might appear (a full class set of three papers can be done in little over an hour, once marked), and I put meeting time aside for us to mark together, which doubles as an excellent moderation exercise. We then use that QbyQ analysis to target areas that many pupils have struggled with.
A word of warning with this: Daisy Christodoulou, in her excellent book Making Good Progress: The Future of Assessment for Learning, explains the dangers of using summative assessments in a formative way, and the difficulties of using questions from these assessments to diagnose weaknesses.
This is particularly true of the larger 4, 5 and 6 (or more) mark questions, so we tend to ignore those completely.
Instead, we focus on the 1 or 2 mark questions, which are often just about one key skill. The AQA exams we use also have multiple-choice questions, so we pay particular attention to where pupils are not scoring well on those (although even some of those involve multiple skills).
We can then decide on actions based on that analysis: do we need to reteach, put on some intervention, or just give pupils the tools to go and deal with the issue independently (if we are sure they can)?
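For departments that prefer a script to a spreadsheet, here is a minimal sketch of the facility check described above. To be clear, this is not our actual QbyQ sheet: the marks, the tariffs and the 50 per cent threshold are all invented for illustration.

```python
# Hypothetical QbyQ facility check - all data and the threshold
# below are invented, not taken from a real class.

max_marks = {"Q1": 1, "Q2": 2, "Q3": 1, "Q4": 5, "Q5": 2}  # tariff per question

marks = {  # question -> one score per pupil
    "Q1": [1, 1, 0, 1, 1],
    "Q2": [2, 0, 1, 0, 0],
    "Q3": [0, 0, 1, 0, 0],
    "Q4": [3, 1, 4, 2, 0],
    "Q5": [2, 2, 1, 2, 2],
}

FACILITY_THRESHOLD = 0.5  # flag questions where the class scored under half the marks

for q, tariff in max_marks.items():
    if tariff > 2:
        continue  # ignore the larger questions, as discussed above
    facility = sum(marks[q]) / (tariff * len(marks[q]))
    if facility < FACILITY_THRESHOLD:
        print(f"{q}: class scored {facility:.0%} of available marks - candidate for reteaching")
```

The same logic transfers directly to a spreadsheet: a row of tariffs, one facility formula per question and conditional formatting as the flag.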
2. Motivate action from the pupils
I am a big believer in the idea that, if pupils have all of this feedback at their disposal, they should definitely spend some time acting on it. One thing I get my classes to do is to look at each question they attempted and categorise it under one of four headings:
1. "I scored full marks and understand why" - these are questions the pupil is confident with, and they don't need much (if anything) in the way of reflection.
2. "I scored full marks but don't understand why" - these are questions the pupil guessed at. This is always possible with exams that contain multiple-choice questions, and the temptation for a pupil is to go, "Yes, I got that right!" Pupils need to recognise when this was more about luck than judgement, because that luck is unlikely to be repeated in the final exam.
3. "I didn't score full marks but I know why" - these are questions the pupil made a mistake on, but they know enough to recognise the mistake immediately upon reflection. This can be helped if you have signposted little errors whilst marking - for example, circling where a pupil has written "7 × 3 = 18". Pupils need to recognise the mistake, but also that they probably still understand the concept the question was testing.
4. "I didn't score full marks and I don't know why" - these are misconceptions or gaps: the pupil thought they knew enough maths to attempt at least part of the question, but got into difficulty along the way. They may know they have gaps in their knowledge here, or may have been oblivious to them. Either way, they need to tackle the problem.
Once pupils have categorised the questions, the focus is on understanding the concepts within the questions in categories 2 and 4. Much of this has to be done independently by the pupils, but we supply the tools to support it and make time for pupils who are struggling to make headway.
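The four headings boil down to a two-way split - full marks or not, understood or not - and that structure is easy to see in a short sketch. Pupils do this reflection on paper; the Python below, including the self-report flag and the example data, is invented purely to illustrate the logic.

```python
# Hypothetical sketch of the four-way reflection, assuming each pupil
# records (score, max_marks, understands_why) for every question.
# Invented for illustration - pupils actually do this on paper.

def categorise(score, max_marks, understands_why):
    """Return the reflection category (1-4) for one question."""
    full_marks = score == max_marks
    if full_marks and understands_why:
        return 1  # confident: full marks and understands why
    if full_marks:
        return 2  # a guess: full marks but doesn't understand why
    if understands_why:
        return 3  # a slip: lost marks but can see the mistake
    return 4  # a misconception or gap: lost marks and can't see why

# Invented responses: question -> (score, max marks, understands why?)
responses = {
    "Q1": (1, 1, True),
    "Q2": (2, 2, False),  # guessed a multiple-choice answer
    "Q3": (0, 1, True),   # spotted the slip on reflection
    "Q5": (0, 2, False),  # a genuine gap
}

# Categories 2 and 4 are the ones that need follow-up work.
follow_up = [q for q, r in responses.items() if categorise(*r) in (2, 4)]
print("Questions to work on:", follow_up)  # -> ['Q2', 'Q5']
```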
3. Use the same mock each year to predict outcomes and target pupils
This will depend on your mock exam practice. Many schools will use the latest set of papers for their mock exam, applying the grade boundaries from the summer and targeting those pupils who fall short of the score required for the grades they need.
The possible problem with this practice is that (particularly in maths) papers change. GCSE maths exams only ever test a sample of the whole domain of knowledge we have taught pupils, and because of the AO2 and AO3 requirements, topics that appear in hard-to-access questions on one set of papers may appear in far more accessible questions on the next - or may be absent altogether.
In general, using performance on one set of papers to predict performance on a different set of papers can be problematic at best.
At my school, we take a different approach to mock exams: we use the same papers each year. While this does open us up to the possibility of pupils getting hold of the papers early (from older siblings or friends), we don't explicitly tell pupils we are doing this, so in practice it has not really been an issue.
Our reprographics department is also very good at removing all identifying information from the papers, so pupils cannot look up the answers online without real effort.
What this allows us to do is match mock performance against real performance. Since the start of the new GCSE, we have two cohorts' worth of data on how pupils' scores on our mock exam translate into results in the real exam, and every year we can refine that data set to make it more accurate.
This allows us to predict with more confidence the outcomes for pupils based on the mock exam, and so tells us who is likely to perform below the grade they need. We can then decide how to tackle this - through group movement, intervention, or otherwise.
Of course, this will never be perfect: there will always be pupils who do really well in the mock and then poorly in the final exam, and vice versa. But as time goes on, these anomalies will become a smaller and smaller part of the data set, and the predictions will get better.
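To make the prediction step concrete, here is one possible sketch - not our actual method, which the description above doesn't fully pin down. It treats the median mock score of past pupils who achieved each final grade as an indicative boundary, then flags current pupils whose predicted grade falls below their target. All names and numbers are invented.

```python
# Hypothetical mock-to-real prediction sketch - all data invented.
from statistics import median

# (mock score, final GCSE grade) pairs from past cohorts.
history = [
    (112, 5), (98, 4), (130, 6), (85, 4), (60, 3),
    (140, 7), (105, 5), (92, 4), (125, 6), (70, 3),
]

# Indicative boundary: median mock score of pupils who achieved each grade.
boundaries = {
    grade: median(score for score, g in history if g == grade)
    for grade in {g for _, g in history}
}

def predict(mock_score):
    """Highest grade whose indicative boundary the mock score meets."""
    met = [g for g, b in boundaries.items() if mock_score >= b]
    return max(met) if met else min(boundaries)

# Invented current pupils: mock score and target grade.
current = {"Pupil A": (95, 5), "Pupil B": (128, 6)}

for pupil, (score, target) in current.items():
    predicted = predict(score)
    note = " <- below target, consider intervention" if predicted < target else ""
    print(f"{pupil}: predicted grade {predicted} (target {target}){note}")
```

With only two cohorts behind them, these boundaries will be noisy - exactly the anomalies described above - which is why each additional year of data makes the predictions better.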
So, there we have three things that can be done with mock data - feel free to question or comment, or share your own strategies in the comments below.
Peter Mattock is a head of maths at an 11-16 school in Leicestershire and author of Visible Maths, due to be published in February 2019.
Further reading
- Teacher Omar Akhbar on the dangers of the messaging around GCSE mocks
- Could handwriting bias be impacting your pupils’ results?
- An English teacher on how he got a glut of grade 9s