Seeing red over stats

Go beyond the superficial colour codes of RAISE data to find out what’s really going on in your school
20th January 2017, 12:00am

https://www.tes.com/magazine/archived/seeing-red-over-stats

By now, every headteacher and senior leader in England has trawled through their RAISE reports and inspection dashboards, often opening them with the trepidation of someone approaching a suspect package.

What will we find inside? Shiny gifts or booby traps? Flicking through the pages, we look for coloured boxes that reveal statistical significance. Red indicates data that is significantly below average. Green, data that is significantly above average.

Data is then shared with staff, governors, the local authority adviser, the school improvement partner - maybe an Ofsted inspector. Everyone zeroes in on the little coloured boxes and almost everyone reaches the same conclusion: green means “school did good”; red means “school did bad”. Simple.

Except it’s not. An indicator of statistical significance simply identifies a deviation from the population average that probably didn’t happen by chance. It is therefore unusual and worthy of investigation. However, no cause can be inferred and no educational significance can be assumed.

If the national average score for throws of a die is 3.5 and your six throws average a significantly below average 1.5, do you assume the die is faulty? Probably not.
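A quick simulation makes the point (Python, purely illustrative): a six-throw run averaging 1.5 or less is rare for a fair die, but with thousands of cohorts rolling the dice each year, some will land in that tail by chance alone.

```python
# Illustrative sketch: how often does a fair die average 1.5 or less
# over six throws, purely by chance?
import random

random.seed(42)  # fixed seed so the run is repeatable
trials = 100_000
low = sum(
    1
    for _ in range(trials)
    if sum(random.randint(1, 6) for _ in range(6)) / 6 <= 1.5
)
print(f"{low / trials:.2%} of six-throw runs averaged 1.5 or less")
```

The proportion is small, but it is not zero, and no amount of staring at the die will tell you why that particular run came up low.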

The impact of random events

There are many reasons why results may deviate significantly from the average, and this does not necessarily indicate strong or weak teaching, or good or bad leadership. Numerous factors influence outcomes and many (such as deprivation, special educational needs, mobility, and language) are outside the school’s control. Sometimes, it’s just down to a random event - that bout of illness during exam week, or the tornado that removed the roof.

Those schools with green boxes can rest easy as far as their data is concerned, but for those schools seeing red it’s a different matter. My advice is twofold: 1) investigate progress at the pupil level; and 2) check contextual value added (CVA) data in FFT (Fischer Family Trust).

Start with RAISE. Find the pupil list, open it and export to Excel. Look for pupils with very low negative scores in specific subjects, then go to RAISE data management, locate those pupils and delete them. Now run key reports from the list in RAISEonline and select “school’s own data” to view the amended results.
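The “look for very low negative scores” step amounts to a simple filter on the exported list. A sketch of the idea (the pupil names, scores and cut-off below are invented for illustration; the real export will have its own headings):

```python
# Illustrative only: invented pupils and progress scores.
pupils = {
    "Pupil A": 1.2,
    "Pupil B": -0.4,
    "Pupil C": -9.8,   # extreme negative progress score
    "Pupil D": 0.6,
}

THRESHOLD = -5.0  # arbitrary cut-off for "very low" in this sketch

# Surface the pupils dragging the cohort average down
flagged = sorted(name for name, score in pupils.items() if score < THRESHOLD)
print(flagged)
```

However you draw the cut-off, the aim is the same: identify the handful of pupils whose scores are doing most of the damage to the headline figure.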

Often, if the upper confidence limit is close to zero, just removing one pupil will shift data from significantly below to in line with the average.
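To see why, here is a rough sketch assuming a RAISE-style 95 per cent confidence interval built from a fixed national standard deviation (mean ± 1.96 × national SD ÷ √n); the cohort scores and the national SD of 3.0 are invented for illustration:

```python
# Sketch only: RAISE-style confidence intervals are built (broadly) from a
# fixed national standard deviation, not the school's own spread.
from math import sqrt

NATIONAL_SD = 3.0  # assumed national spread of progress scores

def upper_confidence_limit(scores):
    mean = sum(scores) / len(scores)
    return mean + 1.96 * NATIONAL_SD / sqrt(len(scores))

# Twenty pupils: one extreme negative score, the rest hovering around average
cohort = [-15.0, 0.5, -1.2, 0.8, -2.0, 1.1, -0.6, -1.5, 0.3, -2.4,
          1.0, -0.8, -1.7, 0.6, -2.1, -0.9, 1.2, -1.8, -2.0, -1.5]

print(upper_confidence_limit(cohort))   # just below zero: "significantly below"

trimmed = [s for s in cohort if s != -15.0]
print(upper_confidence_limit(trimmed))  # above zero: in line with average
```

With the extreme pupil included, the upper confidence limit sits just below zero, so the whole cohort is flagged as significantly below average; remove that one pupil and the interval straddles zero.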

Print this data out and staple it in your main report. Then use CVA to ensure that all those external influencing factors mentioned above are taken into account in your progress measures. But be warned: CVA doesn’t always do you a favour.

Above all, please remember: green does not necessarily mean good and red does not always mean bad.


James Pembroke founded Sig+, an independent school data consultancy, after 10 years working with the Learning and Skills Council and local authorities. www.sigplus.co.uk
