Backlash against flawed test data
Several newspapers used national test results released through the Convention of Scottish Local Authorities under freedom of information legislation to label councils and schools as the best and worst.
Inevitably, authorities such as East Renfrewshire emerge near the top while Glasgow comes bottom. But there are also wide disparities within authorities.
The figures show steady, but uneven, improvements over the last six years.
From 1999 to 2004, the proportion of S2 pupils achieving level E and above rose from 45 per cent to 65 per cent in reading, from 38 per cent to 52 per cent in writing and from 42 per cent to 60 per cent in maths (complete results for 2005 are not yet available because seven councils have yet to supply figures). The HMIE yardstick is that 75 per cent of S2 pupils should have reached the level E standard.
But Michael O’Neill, North Lanarkshire’s director, this week hit back at what he believes is the misrepresentation and mishandling of information, while leading researchers and the local authorities’ education leader have called into question the reliability of 5-14 test results as a measure of school performance (page three).
Mr O’Neill told The TES Scotland: “Most media make the assumption that achieving level E by the end of S2 is something all people should be able to do, but it was never intended to be that way.
“It’s a good time for the Education Minister to look again at the way we report different standards of reading, writing and maths. Level E is not the basic level. We need to reflect as a nation on what we mean by basic standards, maybe using level D for a minimum competency.
“The target of 75 per cent achieving level E is an arbitrary figure and schools are pilloried if they have not achieved it. In international tables, we compare strongly in the same areas.”
Mr O’Neill also believes that it may be time to consider reintroducing what used to be called arithmetic for pupils who are still at level C or below when they leave primary and enter secondary. “They are spending time doing things that are not relevant to their level,” he said. “In terms of reading and writing, we know what we are doing. But are we not confusing maths work with arithmetic?”
What people wanted to know was whether youngsters could handle money if they worked in a shop and manage other essential numerical skills.
Mr O’Neill also said that a range of developments in schools were emphasising how out of date performance tables were. These included the introduction of Standard grade in S2 and the extension of vocational courses for pupils, which did not feature in the results.
“Schools that are sometimes the most successful in delivering vocational and alternative courses appear to be less successful because there is no equivalence in the (Scottish Qualifications Authority) framework,” he said.
Some of the pupils about whom the most concern was expressed were taking City and Guilds level 1 in various trades outwith the SQA structure.
Some of the most vulnerable pupils were also achieving through other routes such as Skillforce and the Duke of Edinburgh’s Award Scheme.
Meanwhile, Lindsay Paterson, professor of education at Edinburgh University, described any bid to assemble performance tables based on national test results in S2 as “just nonsense”. These were simply formative tests that confirmed teachers’ judgments. “At least Highers are nationally standardised,” Professor Paterson said.
He also rejected value-added measures as a more accurate check on school performance, saying research in England had discredited them.
The research concluded that well-motivated children arrived at school with higher levels of prior achievement and a capacity to learn effectively. “That is independent of what the secondary school gives them,” Professor Paterson argued.
He added: “It’s as much a given that there is no way of comparing value-added statistics as there is raw statistics. At least raw Higher results provide a simple, meaningful statistic.”
Brian Boyd, education professor at Strathclyde University and a member of Glasgow’s newly established education commission, said that it was “incredibly dispiriting” for schools in difficult areas which found it almost impossible to leapfrog schools in higher socio-economic areas.
Professor Boyd, too, dismissed 5-14 tests as unreliable and never designed for the purpose for which they were being used.
Ewan Aitken, education spokesperson for the local authorities, observed: “A-E level tests are tools to assess the progress of individual children. They were never designed to assess the progress or effectiveness of a school. They are not sat under exam conditions, nor are they externally moderated. They are a teaching tool, but no more than that.”
Mr Aitken said the real test was how well pupils did after leaving school.
“To suggest that, because not everyone is at the same stage at the same age, the school has failed is self-evidently inaccurate.”
Commenting on the latest batch of data, Peter Peacock, Education Minister, said: “Any serious observer of education knows a school’s performance cannot be judged on attainment alone. There are a whole host of factors which contribute to a school’s success, and only by looking at them all can anyone get a true picture of how a school is performing.”
The Scottish Executive no longer collects 5-14 test results, although local authorities will probably continue to do so until an improved system emerges. Next month, for example, the Executive is set to release the first data from its pioneering Scottish Survey of Achievement (SSA), which replaces the national performance data gathered under the Assessment of Achievement Programme (AAP).
The first results, in English and core skills at P3, P5, P7 and S2, will show how pupils in schools across half the authorities are doing. The random sample will be used for schools’ self-evaluation. The same cohorts of unnamed pupils will then be followed as they progress through school, and it will be left to local authorities to gather data on their own schools and compare it with the SSA findings.
One insider said: “It will give us an accurate picture across the country and not just in English and maths.”
The SSA will also look at social subjects and science over the next few years but it will not offer school-level information, unlike key stage testing in England.
The SSA’s mandate
* To conduct annual surveys at national level and for half the local authorities each year - but not to report on individual schools or pupils.
* To report on the percentage of pupils at a particular stage who have attained specific levels: for example, the percentage of pupils in P7 who are at level D in reading, the percentage who have not yet attained level D but are secure at level C, and the percentage who have attained level E.
* To compare the performance of girls and boys.
* To provide detailed information on aspects of performance and on class organisation, resources and approaches.