Why we should mark maths problems like English essays

Traditional methods of marking in maths can miss students who are brilliant problem-solvers, says David Thomas, who suggests a new approach
8th August 2024, 8:00am

This article was originally published on 25 February 2024

In discussions about marking workload, maths teachers usually seem to come off as the “winners”. How lucky we are to have it so easy. How great it must be to mark mock exams in hours, not days.

But do the same factors that make our marking easy actually make our assessments less valid?

Yes, it is easy for us to assess whether someone has got the right answer or applied a procedure correctly using traditional “tick and flick” approaches. But it’s much harder to look for whether someone is constructing a sound mathematical argument on a relatively open-ended problem.

That’s why, at Axiom Maths, we’ve decided to take a leaf out of our English teaching colleagues’ books and look at alternative approaches to marking in maths.

A new approach to maths marking

One method that has taken off in English is comparative judgement. This involves teachers comparing two pieces of work and deciding which one is better. By comparing lots of pairs of work you can assign a scaled score that reflects how good one piece of work is relative to the others.
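
Under the hood, comparative judgement tools typically turn those pairwise "which is better?" decisions into scaled scores by fitting a statistical model such as Bradley-Terry, in which each piece of work gets an estimated "quality" parameter that best explains the observed judgements. The sketch below is illustrative only: it is not No More Marking's actual algorithm, and the function name and toy data are invented for the example.

```python
# Illustrative Bradley-Terry fit for comparative judgement, using the
# standard minorisation-maximisation (MM) update. Not the algorithm of
# any particular platform; a minimal sketch for intuition.
import math

def bradley_terry(judgements, n_items, iters=200):
    """Estimate a quality score per piece of work from pairwise wins.

    judgements: list of (winner_index, loser_index) pairs.
    Returns log-ability scores (higher = judged better), centred on zero.
    """
    wins = [0] * n_items
    pair_counts = {}  # (i, j) with i < j -> times that pair was compared
    for w, l in judgements:
        wins[w] += 1
        key = (min(w, l), max(w, l))
        pair_counts[key] = pair_counts.get(key, 0) + 1

    p = [1.0] * n_items  # ability parameters, all equal to start
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            # MM update: p_i = wins_i / sum over opponents of n_ij/(p_i+p_j)
            denom = 0.0
            for (a, b), n in pair_counts.items():
                if i in (a, b):
                    denom += n / (p[a] + p[b])
            new_p.append(wins[i] / denom if denom else p[i])
        # Normalise so the geometric mean is 1 (log scores centre on 0).
        g = math.exp(sum(math.log(x) for x in new_p) / n_items)
        p = [x / g for x in new_p]
    return [math.log(x) for x in p]

# Toy data: piece 0 beats 1 and 2, splits a rematch with 1;
# piece 1 beats 2 once and loses to it once.
scores = bradley_terry([(0, 1), (0, 2), (1, 2), (1, 0), (2, 1)], 3)
```

In practice each piece of work only needs to appear in a handful of comparisons: the model pools information across the whole pool of judgements, which is why 140 teachers making many quick decisions can reliably rank 1,700 scripts.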

The advantage of this approach is that it can’t be gamed and doesn’t reward tick-box exam technique. Teachers don’t choose the piece of writing that best matches the descriptors in a mark scheme; they choose the piece that is the best.

We have worked with No More Marking, an online provider of comparative judgement software, to trial using this style of assessment in maths. We gave 1,700 students a pair of relatively open problems. These problems required limited content knowledge, but lots of high-quality reasoning. A pool of 140 teachers from participating schools then made thousands of judgements to allow us to score the work.

So, what did we learn from this?


First, we found that there are students out there who are brilliant at maths but who aren’t succeeding on standard assessments.

In our research, we gave students a multiple-choice test with traditional-style maths questions, alongside the problem-solving questions. While most students got similar results on both, this wasn’t true for everybody. We found some children who did poorly in the traditional questions but who were brilliant problem-solvers.

I recently saw a Year 7 student who fitted this mould. When solving a problem that depended only on numbers and logic, she was flying, far ahead of the rest of the group. But when given a problem about shape, she didn’t know how to find the area of a rectangle. Her teacher later told me how disrupted her primary education had been: she’d missed so much school that she had huge knowledge gaps.

How many children like this must there be across the country, whose potential is getting missed by traditional assessment? Imagine if we found them and recognised their talent.

Second, we found that maths teachers do not all agree on what good problem-solving is. How, for example, should we weight the relative importance of the right answer against well-expressed reasoning? What if the reasoning is well-expressed but inaccurate?

In this research, we didn’t tell teachers what to value. We just told them to select the best piece of maths, and this led to some challenging decisions.

For example, consider this scenario. Child A does no “working out” on paper but gets the correct answer. Child B, on the other hand, gives well-organised working and writes an explanation but their answer is wrong, and their method doesn’t really tackle the question. Who is better?

Overall, we found that teachers ranked having some working highly, especially if it involved pictorial representation. This was true even if the reasoning behind that working was inaccurate.

However, teachers took a dim view of solutions consisting only of the correct answer.

So what does all of this mean for how we mark in maths?

We don’t claim to have perfected the use of comparative judgement for our subject. But we think it shows promise. We’ve all encountered work from a pupil that didn’t tick every rubric box yet showed clear understanding and an innovative approach.

We want to celebrate those hidden gems, to value understanding and bold thinking as highly as procedural recall. Fresh thinking around marking could put renewed emphasis on reasoning, encouraging pupils to experiment, take risks and learn from mistakes.

And the best thing? Even with this new approach, maths still doesn’t take that long to mark.

David Thomas, CEO of Axiom Maths, is a former maths teacher, secondary headteacher and Department for Education adviser

