Exclusive: Pisa data may be incomparable, Schleicher admits

Falls in Asian high-flyers’ maths scores could simply be down to the switch to computer tests
24th March 2017, 5:31am

The man running the world’s most influential education study has admitted that seemingly dramatic changes in performance for top-ranked countries shown by its “comparable data” could, in fact, be explained by changes to the way its tests are delivered.

According to the latest Programme for International Student Assessment (Pisa) study from 2015, the global top-ranked performers in maths have all seen falls in the percentage of their pupils achieving high test scores in the subject over the previous three years.

That apparent decline in the ability of East Asian maths superpowers to stretch the brightest could have wider implications. Schools in the US and the UK have invested heavily in emulating the Asian maths “mastery” approach.

But now Andreas Schleicher, the official in charge of Pisa, has said that this fall may not be due to a drop in the performance of these Asian powerhouses. He said he was looking into whether the decline could be explained by the fact that Pisa used computers for the main tests for the first time in 2015.

In other words, data that is clearly presented as “comparable” in the study may not be comparable at all.

The admission has led critics to question the overall reliability of Pisa and to call for the Organisation for Economic Co-operation and Development (OECD), which runs the study, to be more open about its limitations.

According to the 2015 Pisa study, all six of the top-ranked systems in maths with comparable data saw falls in the percentages of their pupils with top levels of attainment in the subject, compared with the previous 2012 study.

South Korea, Taiwan and Hong Kong experienced respective declines of 10, 9 and 7 percentage points in the proportions of students with the ability to select and evaluate appropriate strategies for complex problems. Singapore, Japan and Macao also saw drops.

China, the other top-ranked performer in maths, had no comparable data as it entered only as Shanghai in 2012.

When asked why he thought these education superpowers had all dropped in performance, Mr Schleicher admitted they might not have done so at all. The OECD education director suggested that the move to computer-based tests might be the reason.

“Further analysis is needed to establish the causes of decline in the share of top performers in some of the highest-performing countries,” he said.

He said that although the study had ensured that, “on average”, pupils taking paper- and computer-based tests scored the same, this might not be true for some groups of high-performing pupils.

“It remains possible that a particular group of students - such as students scoring [high marks] in mathematics on paper in Korea and Hong Kong - found it more difficult than [students with the same marks] in the remaining countries to perform at the same level on the computer-delivered tasks,” he said.

“Such country-by-mode differences require further investigation - to understand whether they reflect differences in computer familiarity, or different effort put into a paper test compared to a computer test.”

But there is no mention of that possibility alongside the data in the report showing the change in the percentage of top-performing students between 2012 and 2015. The report clearly says the data is “comparable”.

The possibility that the change to computer tests could have made a general difference is covered elsewhere in the study, but then largely discounted.

Pasi Sahlberg, an expert in global education reform, said that Mr Schleicher’s admission could have wider implications.

“It raises new questions about the reliability of the [Pisa] test itself,” he said. “Students’ measured literacies in reading, mathematics and science should not depend on how they are measured, if the scope of testing remains the same.”

This is an edited version of an article in the 24 March edition of the new-look Tes. Subscribers can read the full story in the magazine, which is available at all good newsagents.
