How to use data to spot pupil behaviour issues ahead of time

Amassing data is all well and good but it has to be used to make meaningful decisions – such as spotting potential behaviour issues before they boil over, as Nathan Burns explains
8th March 2022, 10:00am

Data is everywhere in schools now - from assessment and attendance to communication and rewards.

Yet the area where data is crucial for pastoral leaders is around behaviour, usually focussing on behaviour points (or consequence or demerit points, depending on your school).

Consequences are given out for anything from failure to complete homework, to low-level disruption of lessons and uniform failures.

Logging this data in SIMS and SISRA systems can provide crucial insights for heads of year, heads of faculty and SLT by allowing them, on a week-by-week basis, to monitor behaviour and hopefully spot any notable changes and act accordingly.

Avoid playing ‘whack-a-mole’

However, the problem with gathering so much behaviour data is that it can quickly become unmanageable, meaning you have no real idea which students need the greatest support and so you simply play "whack-a-mole" - reacting to each week's data rather than spotting trends and providing support before an issue unfolds.

As such, it's crucial to have a system for properly assessing and understanding the data you are gathering, so you can intervene sooner when a student starts displaying behaviour issues more often than expected.

Sounds great - but how can you actually make this work?

In our school, I developed a system whereby the consequence data logged in SIMS, exported as an Excel report, could identify the students picking up more consequence points than their own average - rather than relying on week-on-week data alone.

For example, “Tom” may have recorded 20 behaviour points a fortnight ago, and then 12 in the previous week. This would appear, on the face of it, to be a superb drop, which requires a metaphorical pat on the back.

However, if both 20 and 12 behaviour points are significantly above the mean for Tom, then 12 is still concerning and staff members need to be aware of this to realise that something may still be amiss and react accordingly.

So, within our Excel sheets, we developed a formula that compares each week’s behaviour points for each student as compared to the mean for all previous weeks - rather than just giving staff each week’s data and expecting them to spot the pattern.
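In plain terms, the calculation behind that formula looks like this - a minimal Python sketch using invented weekly figures for "Tom" (in practice the numbers come from the SIMS export):

```python
# Weekly behaviour points for one student, oldest first.
# The final entry is the week just gone. Figures are hypothetical.
weeks = [3, 2, 5, 4, 20, 12]

latest = weeks[-1]
previous = weeks[:-1]
mean_previous = sum(previous) / len(previous)  # mean of all earlier weeks

# Relative change of the latest week against the student's own mean
change = (latest - mean_previous) / mean_previous

print(f"mean {mean_previous:.1f}, latest {latest}, change {change:+.0%}")
# → mean 6.8, latest 12, change +76%
```

Even though 12 is a big drop from 20, it is still roughly 76 per cent above this student's own mean - exactly the kind of pattern the week-on-week view would miss.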

This is easier to do than you might think.

How to generate the data

You’ll need to start with an Excel sheet detailing students’ behaviour points for each previous week, which you’ll likely be able to just pull straight from SIMS or other data management systems (if you’re not sure where this is, a head of year or SLT member will likely know).

Once you have this data, insert two new columns at the end - one titled "average" and one titled "change". In the "average" column, use the =AVERAGE function, dragged across all of the weekly behaviour point columns, excluding the week just gone.

In the "change" column, take the points for the week just gone, subtract the mean, and divide the result by the mean in the "average" column - this gives the latest week as a change relative to the student's own average.

Once these calculations have been carried out, I filter from greatest to smallest and then copy across the top 15 students into a Word document, which staff can access through a link I share with them. Equally, it could be printed out or emailed to staff and tutors.
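The whole pipeline can be sketched in a few lines of Python - a rough equivalent of the Excel steps above, with invented names and figures standing in for the exported SIMS report:

```python
# Each row: a student's name, then behaviour points per week, oldest first.
# Hypothetical data standing in for the SIMS export.
rows = [
    ("Tom",  [3, 2, 5, 4, 20, 12]),
    ("Asha", [1, 0, 2, 1, 1, 1]),
    ("Ben",  [6, 7, 5, 8, 6, 14]),
]

report = []
for name, weeks in rows:
    latest = weeks[-1]
    mean_prev = sum(weeks[:-1]) / len(weeks[:-1])  # the "average" column
    change = (latest - mean_prev) / mean_prev      # the "change" column
    report.append((name, round(mean_prev, 1), latest, change))

# Filter from greatest to smallest change, then keep the top 15 for staff.
report.sort(key=lambda r: r[3], reverse=True)
for name, mean_prev, latest, change in report[:15]:
    print(f"{name}: {latest} pts this week vs mean {mean_prev} ({change:+.0%})")
```

With these figures, Ben tops the list (+119 per cent against his mean) ahead of Tom, even though Tom's raw weekly totals look worse - which is the point of comparing each student against their own average.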

This then makes it easier for them to see where a student is picking up more behaviour points than we might expect, and talk to the student as part of their work to find out what is causing these issues.

Reading between the lines

As ever with data, you need to be switched on to understand what it means - and this sometimes means understanding that what appears to be a huge increase is perhaps not quite as bad as it seems.

For example, if a student records three behaviour points in a week compared with their mean value of 0.2, it may look like the student has had a terrible week - and it may be the start of an issue so you may want to note it.

However, within our context, our system disregards any student with fewer than six behaviour points in the week just gone, as a handful of points is fairly easy to accumulate (eg, two missed homework tasks).

Of course, this level of disregard could vary in different contexts, especially considering the different numerical values put against consequences.

Equally, where behaviour across a cohort is worsening, you may need to raise the threshold - and lower it where behaviour is improving. To do this, you can delete rows within your data analysis sheet where students have fewer than six behaviour points.
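That cut-off is simple to build into the same analysis; a sketch assuming the six-point threshold described above (the names, figures and threshold value are all illustrative - tune them to your own context):

```python
THRESHOLD = 6  # points in the latest week below which we don't flag

# (name, points in the week just gone, change vs own mean) - invented figures
flagged = [("Tom", 12, 0.76), ("Asha", 3, 14.0), ("Ben", 14, 1.19)]

# Asha's 3 points look like a huge jump against her tiny mean, but they
# fall below the threshold, so she is dropped from the staff report.
kept = [row for row in flagged if row[1] >= THRESHOLD]
print([name for name, *_ in kept])  # → ['Tom', 'Ben']
```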

All of the above may sound beyond your Excel knowledge. However, work with colleagues in your pastoral teams who are more confident with Excel, and after a few weeks of playing around with the data, you’ll know just how to manipulate it.

Staff training

If you do move to implement something like this, it is important to also make sure you explain to staff what the data means so they can act on it correctly.

I took a handful of tutor briefings to explain this, as well as a meeting with other heads of year, so that they were clear on how the data was calculated and what it was showing, too.

The staff quickly got to grips with the data and how to use it.

What has the impact been?

In the several months since we got this up and running, the number of students receiving consequences did not fall, nor did the number of detentions issued.

But the number of behaviour points for the cohort did fall significantly. Students would still be getting behaviour points, but because we were spotting issues that were bubbling up (based on the data), we were talking to students before significant disruptions occurred.

This means that where a student's behaviour had the potential to worsen over time, we were able to step in early, stop that happening and tackle any underlying issues.

On many occasions, this occurred as students could be moved onto target cards and reports sooner, but equally, additional focus and scrutiny from a tutor meant that students focussed more on improving their behaviour.

Early and more directed conversations with students also allowed for problems to be highlighted sooner, such as issues with completing homework or friendships.

Overall, the key is to recognise that although your school may be gathering lots of data, that in itself is not the outcome. The trick is to make sure you are analysing it to inform your decision-making to derive the best possible outcomes for your school and your students.

Nathan Burns is an assistant progress and achievement leader for key stage 3, as well as a maths teacher
