Value added data still has margin of error

21st November 1997

This could well be the last year of national league tables as a major public event, writes Nicholas Pyke. Interest is already waning. The volume of information has risen steeply and is set for a further radical increase.

While the Government backs league tables as a clear measure of accountability, it is also committed to printing them, from next year, in the form of a “value added” analysis - that is, schools will be judged according to their intakes and social circumstances. Which, as the Qualifications and Curriculum Authority has already admitted, will mean new league tables of considerable complexity. In fact it is far from clear that the value added data can be put into league tables of any description.
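In outline, a value-added analysis predicts each pupil's results from the intake a school receives and credits the school with the difference. The sketch below illustrates only that general principle; it is not the QCA's method or the Government's planned formula, and the data columns and function are hypothetical.

    # A minimal sketch of the principle behind a "value added" measure, not the
    # QCA's actual method: predict each pupil's GCSE score from prior attainment,
    # then treat a school's average residual as its value added. The column names
    # ('school', 'prior_score', 'gcse_score') are hypothetical.
    import numpy as np
    import pandas as pd

    def value_added(pupils: pd.DataFrame) -> pd.Series:
        # Fit a straight line predicting GCSE points from the intake measure.
        slope, intercept = np.polyfit(pupils["prior_score"], pupils["gcse_score"], 1)
        predicted = intercept + slope * pupils["prior_score"]
        residual = pupils["gcse_score"] - predicted
        # A school's value added is the average amount by which its pupils beat the prediction.
        return residual.groupby(pupils["school"]).mean().sort_values(ascending=False)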

This year, for the first time, the Government has published a three-year comparison of schools’ performance. But it has found itself criticised in the same way as the previous, Conservative, administration. Teachers’ leaders and academics have combined to argue that this year’s GCSE tables remain unfair.

“Some data is easily better than no data. But it doesn’t mean that this is the best available,” said Professor Carol Fitz-Gibbon of Durham University.

Professor Harvey Goldstein from London University’s Institute of Education was more forthright. “This is a blunderbuss approach which does not do justice to the complexity of what goes on in schools,” he said.

“I think David Blunkett might have been badly advised. There’s no way you can construe these three-year improvement data as a move towards value added. I think the Government’s being slightly dishonest with people.”

Some of the problems with the latest four-year comparisons are straightforward. The first is that schools and local authorities with high performance levels find it hard to improve still further. An obvious, if perverse, example is the Scilly Isles which, with one school, comes at the top of the GCSE table but the bottom of the improvement table.

More basic is the complaint that GCSE results measure pupils’ background rather than a school’s performance. Statisticians have concluded that schools affect only a small percentage of their pupils’ performance. While pupil intake accounts for 50 per cent of the difference between students, school affects only between 10 and 15 per cent. And the latest studies suggest that, within this small proportion, primary schools are more important than secondary schools.
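Those percentages describe how the variation in exam results divides up between pupils' backgrounds and their schools. As a purely illustrative check on what such a split looks like, the sketch below simulates scores in which intake contributes about half the variance and the school roughly a tenth, then recovers those shares; the figures are taken from the claim above, not re-derived from the underlying studies, and the cohort sizes are assumptions.

    # Purely illustrative: simulate pupil scores in which intake contributes about
    # half the variation and the school roughly a tenth (the split quoted above),
    # then confirm the shares by comparing component variances.
    import numpy as np

    rng = np.random.default_rng(0)
    n_schools, pupils_per_school = 300, 100

    intake = rng.normal(0, np.sqrt(0.50), (n_schools, pupils_per_school))  # pupil background
    school = rng.normal(0, np.sqrt(0.12), (n_schools, 1))                  # school effect
    noise = rng.normal(0, np.sqrt(0.38), (n_schools, pupils_per_school))   # everything else
    score = intake + school + noise                                        # school effect broadcasts to every pupil

    total = score.var()
    print(f"intake share: {intake.var() / total:.2f}")  # roughly 0.50
    print(f"school share: {school.var() / total:.2f}")  # roughly 0.12
    print(f"residual:     {noise.var() / total:.2f}")   # roughly 0.38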

Also, the GCSE tables’ failure to take account of transient pupils or pupils with special educational needs can wholly alter the statistics.

Professor Fitz-Gibbon argues that schools should be allowed to leave a proportion of their pupils out of the statistics. “The exclusion rate has gone up dramatically since the performance tables started . . . If schools aren’t allowed to work imaginatively with these students, many schools will exclude them as soon as possible.”

According to many of the critics, the raw tables are not only inaccurate, they are damaging. Schools, says Professor Fitz-Gibbon, have been diverting resources towards those in the middle of the performance range - pupils capable of reaching C grades - so as to boost the five A*-C GCSEs rating.

While the proportion of students gaining five A*-Cs has risen over three years, the proportion achieving lower grades has seen no corresponding improvement.

Moreover, according to Professor Goldstein, there has probably been polarisation among schools as high-achieving institutions cream off even more of the available talent. It is notable, he says, that many of the top 100 schools on the latest index have a high degree of control over their admissions.

The Government’s problems will by no means end when it produces the value added analysis. The sort of simple indicators favoured by politicians and the public are not compatible with scientific accuracy. According to Professor Goldstein, it is only statistically possible to separate out the top 15 and the bottom 15 of every 100 schools. The rest are broadly equivalent.
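One way to see why, on hypothetical figures: give each school a 95 per cent confidence interval around its mean score and count how many intervals lie wholly above or wholly below the national average. The sketch below does that for simulated cohorts; the spread of school effects and the cohort size are assumptions chosen only to show how much the intervals overlap, not estimates drawn from real results.

    # Hypothetical illustration of why most schools cannot be statistically
    # separated: with modest true differences and realistic cohort sizes, most
    # schools' 95% confidence intervals straddle the national average.
    import numpy as np

    rng = np.random.default_rng(1)
    n_schools, cohort = 100, 150

    school_effect = rng.normal(0, 0.15, n_schools)                   # modest true differences
    scores = school_effect[:, None] + rng.normal(0, 1.0, (n_schools, cohort))

    means = scores.mean(axis=1)
    half_width = 1.96 * scores.std(axis=1, ddof=1) / np.sqrt(cohort)
    national = means.mean()                                          # treated as a fixed benchmark

    above = int(np.sum(means - half_width > national))
    below = int(np.sum(means + half_width < national))
    middle = n_schools - above - below
    print(f"{above} clearly above average, {below} clearly below, {middle} indistinguishable")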

“The only fair response is to have a richer information system,” said Professor Fitz-Gibbon. “The QCA supports that.” Whether it can persuade the public to take an interest is a different matter.
