Is education research getting lost in translation?

Politicians are increasingly keen to lift education approaches from countries seen to be performing well. But can teachers trust that such imports will work in their classroom?
19th December 2023, 7:00pm

This article was originally published on 7 September 2023

Imagine a 10-year-old child - let’s call him David - sitting in a maths lesson in a primary school in Merseyside.

David’s school is in St Helens, one of the poorest areas in the UK, but the mastery approaches his teacher is employing originated much further afield: in South East Asian countries such as Singapore, where, according to the Organisation for Economic Co-operation and Development’s (OECD) Programme for International Student Assessment (Pisa) rankings, children consistently outperform those in the UK in maths.

The principles of arithmetic are the same in Singapore as in St Helens, so it stands to reason that David should gain as much from mastery approaches as his counterparts in Singapore.

But is it really that straightforward?

Does it make a difference that, for example, in Singapore, where there is a strong culture of parental help at home with maths and private tutoring, children spend an average of an extra 6.1 hours per week on maths beyond compulsory schooling, while their English counterparts spend 3.2 hours?

Does it make a difference that, in Singapore, the work permit system means that the children of many of the poorest workers are not permitted to attend schools on the island, so are not part of its Pisa scores?

Does it make a difference that the Singaporean child is not being brought up within the traditions of a Western-style liberal democracy?

Such questions are increasingly important in an education landscape where the OECD encourages international comparisons through Pisa - and where the classroom practices of the systems deemed most successful come to be seen as models worth imitating.

Importing education research

In England, for example, Conservative ministers have made maths mastery approaches, imported from Singapore and Shanghai, a “standard fixture” in schools, while past Pisa success has made Finland’s education system an object of fascination for Conservative and Labour politicians alike.

Often, though, some crucial questions are overlooked: when should we be cautious about trying to translate education practice, or the research that seems to back up the effectiveness of that practice, to different nations? Or when might such international translation be more likely to work?

A good example of the problems that can occur when we translate research from one system to another comes in the form of early maths teaching.

“I can see no valid reason unless more research is done - serious research is done - on why we would look to Singapore or Shanghai as the answer to our problems,” says Paul Andrews, professor of mathematics education at Stockholm University, co-author of a recent Oxford Review of Education paper looking at the importation of Singapore maths mastery to England.

The paper, written with colleagues at Malmö University, Stockholm University and the University of Leeds, analyses three textbooks currently used in teaching maths to Year 1 children in England: one an “established English-authored textbook”, the others “Singaporean-authored imports promoted by government as solutions to perceptions of systemic failure” in England.

Andrews’ paper emerged from a wider project on young children and maths, in which he and colleagues looked at educational research and textbooks around the world to identify eight learning outcomes that young children need to become successful maths learners later in life.

‘I can see no valid reason why we would look to Singapore or Shanghai as the answer to our problems’

But they found that two of these - estimation and identifying number patterns - were missing from Singaporean imports to the UK and Sweden. That might come down to differences in educational culture around maths and the experiences Singaporean children have had before Year 1.

Becky Allen, professor of education at the University of Brighton, who has studied the relationship between research and policy, argues that this gap has been felt intuitively. Even proponents of Singapore maths in England, she says, have “felt like something was missing from it”; that children were struggling, in particular, to achieve a strong sense of “arithmetic fluency”, hence the introduction of various “sticking plasters”, such as number bond checks.

“You have to look at the cultural context,” she says. “You have to look at what kinds of pre-school experiences Singaporean children have, what kind of emphasis on numeracy within the home is going on; and what’s leading children in one context to become very fluent in the way they deal with numbers and number processes and not children in the other, having implemented exactly the same types of principles and the underlying programme.”

Andrews says he and his colleagues also found that while opportunities to develop maths competencies occur “fairly uniformly throughout the year in English and Swedish textbooks”, in the Singaporean imports “they are all over by halfway through the year…and they are moving on to much more sophisticated material”. This is an approach that “pays much less attention to the developmental needs of all children”, he argues.

That might be seen as fitting in a Singaporean system in which the children of the city state’s poorest workers are not participating.

Andrews’ research looks at figures on the Singapore work permit system and suggests that it is “reasonable to infer that the children of at least a third of the working population of Singapore are not permitted to live on the island” and study in its schools, meaning that when tests are conducted for Pisa and the Trends in International Mathematics and Science Study (Timss) - in which Singapore is also a world leader - “the children of the lowest-paid workers are systematically excluded”.

“If we removed from Pisa and Timss the children of the lowest-paid third of our workforce - acknowledging that typically there’s a strong correlation between parental income and educational attainment - what would the UK’s Pisa scores look like?” he asks (the OECD has been approached for comment).

More widely, Andrews highlights the fact that Singapore is “effectively a one-party state”, which still permits the corporal punishment of male students in schools and has a more “conformist” educational culture. 

Looking to Singapore or Shanghai is inappropriate, therefore, “because we have not evaluated [their approaches] against a genuine comprehensive intake” or against school cultures that are informed by the principles of Western liberal democracy, he suggests.

Cultural barriers

Translation isn’t only complicated when lifting practice from East to West, though; even within Europe, cultural gulfs can be barriers.

Andrews, also co-author of a paper with two Finnish academics titled Pisa, Timss and Finnish mathematics teaching: an enigma in search of an explanation, argues that what admirers of the Finnish schools system - commonly held up as a gold standard among some commentators in England - typically fail to understand is that its success is down to its “largely untransferable cultural expectations”.

That, he continues, includes a “deep-seated cultural tradition in which reading is extremely highly privileged”, stretching back to post-Reformation Finland, when people had to be able to read to receive the sacrament and thus to marry. Which, in a Lutheran society where sex outside marriage was strictly forbidden, was quite the incentive to learn to read.

And that cultural context for reading in Finland is a big part of the nation’s success - not just in Pisa literacy scores but in maths, too, Andrews contends, given that Pisa maths questions are “typically a word problem” where reading is often more important than maths.

But does the existence of cultural differences like these mean that teachers should always be wary of imported approaches?

According to Allen, there are still valuable lessons that we can learn from what other countries are doing, as long as we exercise a bit of caution. What we need to do, she says, is distinguish “between the things that make transfer possible and the things that make it difficult”.

While “more fundamental theories about learning in relation to cognitive science or neuroscience [have the potential] to transfer really well”, she suggests, it is crucial that we think about factors such as different traditions in education practice, “the social norms or culture of the country” or “the prior knowledge of the children”, where small differences can have a big impact.

It’s a big ask to expect teachers to analyse all these factors themselves, but what they can do is look to experts who consider the range of international research on education and try to work out what will and won’t transfer to their country.

For instance, the Education Endowment Foundation’s (EEF) Teaching and Learning Toolkit summarises high-quality education research from around the world, with the aim of supporting teachers and school leaders who are looking for ideas to improve teaching and learning, particularly for disadvantaged children.

Jon Kay, head of evidence synthesis at the EEF, says that when his organisation is looking for evidence that could offer guidance in the UK, “the first port of call would normally be a high-quality trial that’s happened in English schools”, such as an EEF evaluation.

But often there are areas that schools are interested in where UK research hasn’t yet been carried out; in those cases, the team will usually look to the United States, where income levels are broadly comparable with the UK’s - certainly more so than those of developing nations.

For Kay, “what we try to do in EEF research is to have a process evaluation where we are always trying to capture not just ‘has this worked?’ but to use qualitative evidence to understand…the mechanisms that have led to this change”. 

Ultimately, he says, the aim is to encourage school leaders and teachers to “not just take as read this has been tested somewhere else and it works”, but to think: “there’s promise, it has worked, but why has it worked and can that be replicated in my setting?”

Being cautious

When it comes to trials, sometimes the EEF has found that what succeeds in the US doesn’t seem to translate to the UK. For example, the EEF has carried out several studies on social and emotional learning programmes, such as Positive Action, that were found to have a significant beneficial impact in the US.

“The actual results we got when we tested in English schools didn’t replicate those [US] results,” says Kay. “They had null results, typically.”

The EEF’s evaluations suggested that “if you think you can get a really Americanised programme and bring it over to English schools without changing the language” and without tailoring it to the teachers actually teaching the programme, “then it’s much less likely to be implemented with fidelity and much less likely to have the impact”, says Kay.

Trying to understand the “causal mechanisms” of what has made an educational approach work in one country is key to thinking about whether it will transfer to another, says Steve Higgins, professor of education at Durham University and lead author of the EEF Teaching and Learning Toolkit.

He uses the example of the Teaching at the Right Level method developed in India, which involves teaching children in groups of similar ability to help improve basic literacy and numeracy. Its success was backed up by randomised controlled trials.

But when it was transferred to other nations, including in trials in Ghana, “the results were much more modest”, says Higgins.

“People thought they understood the causal mechanism - which was thought to be grouping pupils in more similar, more homogeneous groups,” he explains. But, actually, the pedagogy used in India was more complex: “Quite often, there were additional hours in the school day during which volunteers would come in and teach small groups, or there might be a 10-day intensive summer school,” says Higgins.

‘What we need to do is distinguish between the things that make transfer possible and the things that make it difficult’

The US Department of Education’s Institute of Education Sciences has operated the What Works Clearinghouse (WWC) since 2002 - with a remit to “review the research, determine which studies meet rigorous standards and summarise the findings”. But it is only in the past 10 years or so that the WWC has begun looking at international evidence - and ensuring that studies are as comparable as possible is a cornerstone of its approach.

Its protocols state that to be eligible for review, a study must include students in the US, its territories or military bases, or in other OECD member countries “in which English is the primary or most commonly used language (that is, Australia, Canada, Ireland, New Zealand or the United Kingdom)”.

“The big advantage of being open to international evidence is there are opportunities for additional variations in policy, as long as we have confidence that the settings and the populations of students are comparable to what we might have in our own context,” says Jonathan Jacobson, WWC branch chief.

However, this approach has drawbacks: restricting reviews to English-speaking OECD nations means that potentially valuable research from elsewhere falls outside the WWC’s remit.

“I’ve wondered, ‘What if we find a really interesting study from Singapore? What will we do?’” says Jacobson.

Meanwhile, in the Netherlands, the EEF’s toolkit has been published in translation, without changes but with added context for Dutch teachers, explains Anne Luc van der Vegt, education expert at the Netherlands Initiative for Education Research (also known as the Nationaal Regieorgaan Onderwijsonderzoek, or NRO).

But the NRO will also be developing its own domestic studies, along the lines of the EEF’s, because not all research translates, he adds.

For example, setting and streaming is a big theme in the EEF toolkit, notes van der Vegt - but there’s a very different picture in the Netherlands, with its separate tracks for vocational and academic education at secondary level.

“We have to present our own information about our system, and the benefits and hazards of our system,” says van der Vegt. “So we combine international research and Dutch research.”

On themes such as “feedback or metacognition or self-regulation, the danger is much less present that international research won’t be applicable to the Dutch situation”, he says, “because it happens in the classroom and those classrooms aren’t very differently organised in the UK than in the Netherlands”.

Making sure that the contexts are as comparable as possible is a big part of how we determine whether translating practice from one country to another is likely to be a success, then.

But if you can get those comparisons right, there are demonstrable benefits to looking at what other countries are doing.

For example, Kay points to the practice of texting parents about homework, tests and how children are faring at school, with the aim of improving parental engagement and pupil attainment. The approach, which originated in the US, is now widespread in UK schools.

“A lot of those original experiments came from teams at Harvard University and the University of Chicago, looking at the way they would apply nudge principles or policies to changing parental behaviour,” he says.

“When we tested that [in the UK], we had a similar positive result and improvements in attendance and improvements in maths outcomes.”

It’s not only successful interventions that we can learn from, says Higgins: there have also been instances where international evidence has helped to avoid implementing approaches that seem destined to fail. He gives the example of an idea floated for pupils in England to retake the final year of primary school if they had fallen behind in key stage 2.

“The evidence in the Toolkit overall is that repeating a year tends to be negative and it is expensive,” says Higgins. He believes that this helped to ensure that the idea was dropped “before it had a chance to become policy”. 

He also points to the trend in England of training teaching assistants to do small-group or one-to-one work, on a time-limited basis - rather than giving general support in the classroom. 

The EEF conducted trials showing the benefits of that approach; trials that were “consistent with wider international evidence”. That is one area “where not only have we been influenced by the international evidence, but we’ve influenced the international evidence”, says Higgins.

How teachers can learn from other countries

When it comes to teachers learning lessons from other nations, then, what might be the way forward?

Thinking about the cultural and educational fit could be one key factor.

For example, Andrews draws attention to Flanders, the Flemish-speaking region of Belgium, which is a top performer in Pisa and Timss - though that is hidden in Pisa by the fact that Belgium, rather than Flanders, features in the main rankings.

Flanders is “Europe’s most successful system”, argues Andrews. And as a multicultural society with a comprehensive school system, it has cultural similarities with the UK.

Singapore might fit with the ideals of ministers “but, culturally, Flanders is a much more interesting venue for transferable insights,” says Andrews.

“We should be looking at Flanders. And nobody looks at Flanders,” he adds.

Another step might be to think more about the true nature of evidence and research, suggests Allen.

“When we look at evidence of things that work, we tend to jump on things that appeal to our own preconceptions about what we believe to be true,” she says.

“And we often read evidence superficially - either because we are rushing to seek that confirmation or because politicians and, indeed, teachers who are interested in policy are not generally experts in research methods.”

ED Hirsch’s book Cultural Literacy: What Every American Needs to Know, whose list of 5,000 facts drove the knowledge-rich curriculum in English schools via its influence on former schools minister Nick Gibb, rests on one quite weak piece of empirical evidence on French curriculum reform, says Allen.

“Those of us who care about research methods will be reading it in a very different way,” she adds.

When looking at international evidence, Kay says the EEF would encourage teachers not to think “this is from America so it could never work”, but to think, “OK, why is it this has had a positive impact in America?” and consider whether or not there are unique cultural factors in the US that might be a barrier to translating to the UK.

‘Evidence shouldn’t be used to undercut the professional expertise of teachers’

“If evidence is to succeed, we need to accept we can learn things from other contexts, and that learning is valuable and can help improve…but also that evidence shouldn’t be used to undercut the professional expertise of teachers,” he continues. “The best way for evidence to function, either in England or globally, is for teachers to really use their professional expertise when applying it.”

Where scientific research is about the incremental accumulation of knowledge, says Higgins, education research is more like a “mapping exercise where you’ve got to test your assumptions every now and again, either against the existing evidence or try to bring about a process of change to see if you’re successful”.

It’s highly unlikely there’s some silver bullet out there that is going to have a dramatic impact on attainment - but international evidence might offer guides to small improvements in particular areas, he argues. 

That means trialling approaches seen to be successful overseas in the UK “just to be sure they are successful - it may be that in the UK context they are not, or they don’t give you the big gains people think they do”, says Higgins. 

“I see findings from educational research as increasing and informing the range of choices that teachers can make,” he adds. 

Thinking there are no lessons to be learned from other nations in education means missing out on valuable learning. But, on the other hand, thinking that everything can be imported wholesale risks imposing overseas pedagogy or curricula fostered by unique cultural traditions in contexts where those cultural traditions do not apply.

In a world where Pisa drives policymakers’ anxieties about international comparisons, it’s important to think critically about “miracle” solutions from overseas, and to understand when those apparent solutions might be shaped by domestic cultural factors, stretching back hundreds or thousands of years.

In the end, it’s a long way from Singapore to St Helens, in every sense.

John Morgan is a freelance journalist
