Don’t drown in data

‘Measure more and more often’ has regrettably become the mantra of our education system, and teachers are now struggling with the workload generated by an over-reliance on data, says school data consultant James Pembroke. Here, he sets out his blueprint for how schools can transform their approach to data – by focusing only on the assessment that really makes a difference, they will ease the burden on their staff
13th April 2018, 12:00am

Broken down, it sounds manageable: across the year, you will assess against 30 objectives for reading, 30 for writing and 30 for maths, with your 30 students assessed just once on each at the end of every half-term, using the descriptors “emerging”, “developing”, “secured”, or “mastered”.

That’s not too much work. That’s nothing compared with what you did in your last school. Easy.

But then you add them up.

That’s 16,200 assessments you are making in just 39 weeks. That’s the equivalent of 415 assessments in each of those weeks. That would amount to 83 assessments every single day. That’s not manageable. That’s completely ridiculous.
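
If you want to sanity-check those figures, the arithmetic is simple enough to script. Here is a rough sketch in Python, using only the numbers from the scenario above:

    # Back-of-the-envelope check of the figures above. All numbers come
    # straight from the scenario: 30 objectives each for reading, writing
    # and maths; 30 pupils; one judgement per objective at the end of each
    # of the six half-terms; a 39-week school year.
    objectives = 30 * 3          # reading + writing + maths
    pupils = 30
    assessment_points = 6        # one per half-term

    total = objectives * pupils * assessment_points
    print(total)                      # 16200 judgements a year

    weeks = 39
    print(round(total / weeks))       # roughly 415 a week
    print(round(total / weeks / 5))   # roughly 83 a school day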

But in primary schools, where objective-level tracking is common, this is the norm. Some schools are much worse. Some have objectives for all subjects, and some have more than 30 objectives. Some schools I have visited track against as many as 80 objectives in a single subject each term.

And secondary is not much better. In fact, it can be even worse.

Let’s just stop a moment. We need to think this through.

In 2017, a survey commissioned by the Department for Education found that 75 per cent of ex-teachers cited workload as the reason they quit the profession. And a recent report by Unesco revealed that data collection is a huge part of the problem, with more than half of UK teachers saying that demands for data have caused them “unnecessary workload”.

The truth is, much of the data we collect is meaningless, and a significant proportion of it is a distraction from, not an aid to, teaching. Mindless data collection is making the profession intolerable.

How did we get into this mess?

And more importantly: how are we going to get out of it?

Let’s go back to 1988, when levels were introduced in England and Wales alongside the national curriculum.

Levels were designed as a means of assessment that would measure pupils’ progress against a national framework. Attainment targets set out the standards that pupils were expected to reach in each subject, at each key stage, with clear milestones for progress and achievement.

In 2008, assessment became scaffolded even further by the introduction of assessing pupils’ progress (APP), which provided a series of assessment focuses to support teachers in making judgements about how students were progressing. Soon, we had not just levels, but sub-levels, with some schools requiring students to know their current sub-level in each subject by heart and be ready to report this to an Ofsted inspector, should they happen to ask for it.

Then, in 2014, everything changed. Under the coalition government, levels were scrapped. Speaking at the National College for Teaching and Leadership “Seizing Success” conference in June 2013, Michael Gove, then education secretary, explained the government’s decision, saying that the system encouraged teachers to “focus on a pupil’s current level, rather than consider more broadly what the pupil can actually do”.

Levels were ditched for sound reasons: they told us nothing about what pupils could or couldn’t do, they implied that progress was linear, and they encouraged pupils to be moved on to the next threshold rather than consolidating learning and deepening understanding.

Their removal should have been a watershed; the decision gave schools licence to develop more meaningful approaches to assessment that were geared towards teaching and learning.

However, there was a problem.

Another level

Instead of relying on levels, Gove suggested that schools would “introduce their own approaches to formative assessment, to support pupil attainment and progression”. “The assessment framework should be built into the school curriculum, so that schools can check what pupils have learned and whether they are on track to meet expectations at the end of the key stage, and so that they can report regularly to parents,” he said.

Ofsted inspections, he added, would be “informed by whatever pupil tracking data schools choose to keep”.

So what the government did, essentially, was take away schools’ common language for communicating student progress, demand that they create a new language without any training in how to do so, and then instruct them to speak that language to the very people who held them accountable.

Was it really a surprise, then, that in July 2015, more than a year after levels were officially abolished, the DfE published research revealing that more than a third of schools were still using levels to measure progress? Or that many of the new systems adopted so closely resembled levels?

And, given that schools had, since 1988, always been handed a ready-made methodology of assessment, and so had no expertise in creating new assessment systems, was it any surprise that the tools they created were so often not fit for purpose?

What you see in many schools now is levels on steroids. Teachers are still collecting and entering vast amounts of data into tracking systems and spreadsheets; and senior leaders are still crunching it into tables, charts and graphs for various (usually external) audiences.

But “measure more and more often” has become the mantra, because schools are covering every angle that may be looked at, and don’t necessarily know what other schools are doing. “What does good assessment look like?” they ask. “It’s largely up to you,” comes the answer. So schools cover their bases.

Admittedly, Ofsted has tried to help. Its national director, Sean Harford, has made it clear that inspectors don’t need to see “vast amounts of data, spreadsheets, charts and graphs” and that inspectors will “judge the effectiveness of assessment and whether it is having an impact on pupils’ learning”.

The Ofsted handbook states that inspectors do not “expect performance and pupil-tracking information to be presented in a particular format”.

Similarly, the Commission on Assessment Without Levels states that “there is no point in collecting ‘data’ that provides no information about genuine learning”, while the Data Management Review Group encourages schools to “be ruthless: only collect what is needed to support outcomes for children” and “always ask why the data is needed”.

But none of this addresses the key reasons why data is out of control: fear of accountability, fear that your system is missing some crucial element, and fear of what other schools are doing.

And that fear comes from a lack of knowledge. In the name of giving schools autonomy, we have in practice left them adrift, forced to work things out without the required expertise and without an environment for sharing knowledge between schools (accountability and league tables discourage that). There is a lack of proper training, of agreed definitions and of models of what good assessment looks like, beyond the basic pronouncements that there should not be much of it and that it should support learning.

So what needs to change?

One option is to review the way that we hold schools accountable. This is something that a new commission launched by the NAHT headteachers’ union hopes to address. The independent commission is aimed at overhauling England’s “high-stakes, low-trust” accountability system.

It is due to report in September, but any recommendations it makes will take time to implement - if they are implemented at all. And with teachers leaving the profession in droves, we need a more urgent solution.

That leaves us with the second and more immediate option: for school leaders to start taking Ofsted at its word and to really focus on collecting only the data that has a demonstrable impact on outcomes.

The question, then, is: how do you do this? What would a blueprint look like for a new approach to gathering and using data in schools?

Truthfully, there is no simple answer to this question. But the good news is that there are schools already moving in the right direction. And by taking lessons from their practice, we can start to turn our data culture on its head - away from accountability and towards learning - with these three steps:

1. Gather less data

The first step in our blueprint is a simple one. The sheer amount of data that we are asking teachers to collect and process is contributing to unmanageable workload. And it isn’t necessary or effective: gathering more does not mean gathering better.

But how do you reduce the amount of data?

“We should always be thinking about both the costs and benefits of collecting any data,” says Michael Tidd, headteacher at Medmerry Primary School in West Sussex. “The problems come when the person making the decision about what data to collect only deals with the apparent benefits. If you’re not the person having to come up with and enter the data then it’s easy to underestimate those costs. In reality, teachers’ time is probably one of the most valuable assets in the profession.

“Whenever we forget that, we end up losing good teachers.”

So talk to your teachers.

And to make sure that we are not wasting teachers’ valuable time, something as simple as reducing the number of data drops in a year can make a real difference. At Broadland High School in Norfolk, deputy headteacher Simon Laycock has done just that, by taking an already minimal system and reducing it further. “We only ever collected a grade each term, which was infrequent by many schools’ standards. This is now twice a year, following an assessment point - an exam in most subjects,” he explains.

By reducing the amount of data it collects, the school has cut workload and freed staff up to concentrate on their teaching.

This is something that Dan Rodeck, head of Filton Avenue Primary School in Bristol, has also focused on. He decided it was time to streamline his school’s data collection procedures after he conducted a staff survey on workload, and learned that the school’s assessment policy was causing unnecessary work.

He implemented a number of changes in line with the advice from Ofsted and the DfE on data and assessment, and now asks staff to focus not on the quantity of assessments but on the quality of feedback.

“Our data collection system has been stripped back,” Rodeck says. “Our new policy, based on evidence, places emphasis on immediacy and whole-class feedback.”

2. Only gather data that counts

As well as reducing the amount of data we collect, we also need to make sure that what we collect really counts for something. Again, this means building assessment from the classroom upwards, with input from teachers.

At The Thomas Hardye School in Dorchester, Dorset, staff have refocused their approach to concentrate on what assistant headteacher Tim Ennion calls “assessment that matters”.

“This effectively means continuous, high-impact assessment for learning,” says Ennion. “As a school, we don’t advocate spending hours ticking books, we don’t dictate summative assessment timelines or impose assessment policies on departments or collect masses of assessment data on our management information system. None of this has a significant impact on classroom practice, and we can’t justify the time spent doing it.”

The rule of thumb should be that if the data does not tell us anything new and cannot be acted upon, then we don’t really need to be gathering it.

Assessments against learning objectives can be useful for senior leaders and parents in that they reveal pupils’ security in key areas of the curriculum, but this needs to be kept to a minimum. It is better to focus instead on assessment that provides teachers with useful information about gaps in learning, such as question-level analysis from regular, low-stakes tests or from whole-class feedback.
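
To make that concrete, here is a minimal sketch, in Python, of what question-level analysis might look like; the questions, topics and scores below are invented purely for illustration:

    # A minimal sketch of question-level analysis from a low-stakes test.
    # The questions, topics and scores are invented for illustration.
    results = {
        # question: (topic, pupils answering correctly, class size)
        "Q1": ("place value", 27, 30),
        "Q2": ("fractions", 12, 30),
        "Q3": ("fractions", 14, 30),
        "Q4": ("geometry", 25, 30),
    }

    # Flag any question where fewer than two-thirds of the class succeeded
    for question, (topic, correct, total) in results.items():
        if correct / total < 2 / 3:
            print(f"{question} ({topic}): {correct}/{total} correct - revisit this")

The point is not the code but the principle: the output tells a teacher exactly which topics to revisit, which is more than a termly grade ever can.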

Perhaps more importantly, we need to recognise the limitations of the data we do collect. Much of what is useful in teaching can’t be collected and analysed. Teachers will be constantly making nuanced assessments throughout lessons, but these interactions and observations cannot all be quantified, RAG-rated (red, amber, green) and tracked.

Data is fluid and, in a healthy tracking system, the numbers will go down as well as up. Sometimes children will go forwards, but sometimes they will go backwards, because they have forgotten what they have been taught or have had a bad day.

If we do not accept this, then our tracking system will never be a true reflection of learning. It will be less about where a student is really at and more about the impression that is being given - and this is of little use to anyone.

3. Connect assessment more closely to curriculum

You might notice that a pattern is now beginning to emerge in our blueprint - a pattern of data collection being used in the classroom to inform teaching, rather than as a tool for external reporting.

This is because, above all, any data that we gather should be about moving learning forward. To make sure that her staff do not lose sight of this, Gayle Fletcher, headteacher of Gloucester Road Primary School in Cheltenham, has intertwined the school’s data processes with the curriculum itself.

Building a curriculum that enables teachers to teach “fewer things in greater depth” and which allows for “a focus on key concepts, but also ensures repetition and consolidation throughout the year and key stage”, leads to more effective and efficient data gathering, Fletcher explains.

The curriculum should be designed in such a way that teachers are clear that they are “assessing the right things at the right time”, rather than just assessing for the sake of it, says Fletcher. Her school’s tracking system is used to identify gaps in knowledge that need to be plugged. It acts as a “notebook”, she explains, and is part of a wider system of minimal objectives, feedback and marking.

By ensuring that all assessment links to the curriculum, data is generated from assessment of what has been taught or, in some cases, what has not been taught. But it is done with purpose, with the sole intention of providing feedback about pupils’ progress through the curriculum.

If we get this right, then the data should correlate with the work in pupils’ books. On its own, data proves nothing, but if we start at the right end, assessing for learning and not for accountability, the link should be clear.

Beyond this, we simply need to rid ourselves of our delusions about the role of data. To do this, we must constantly ask questions about the information we gather: who is the data for? Why do they need it? Will it have any impact on learning? What impact will it have on workload?

If the answer to any of these questions is “I don’t know”, then we need to reconsider gathering that data at all.

And we need to be honest. To repeat the words of Michael Tidd: “Teachers’ time is probably one of the most valuable assets in the profession.”

We mustn’t waste it.


James Pembroke founded Sig+, an independent school data consultancy, after 10 years working with the Learning and Skills Council and local authorities. www.sigplus.co.uk
