Research on the impact of educational initiatives can yield useful results. The significance of a finding rests on how confident we can be that the results would be the same if we repeated the study. Conclusions gain greatly in value if they can be generalised and extrapolated beyond a single study, and this is more likely when the samples involved are substantial and the research is methodologically rigorous.
Problems arise when rigorous studies are yanked out of context and freighted with conclusions they were never intended to bear. Individual research studies risk being amplified out of proportion by educationalists with ideological axes to grind. For all their rigour, large-scale studies are not immune to misapplication.
The Education Endowment Foundation’s recent study of MathsFlip is available online. In this year-long randomised controlled trial, pupils in 24 schools (half of them in a control sample) undertook online activities at home, with classroom lessons devoted to reinforcing the learning. Baseline tests, taken in Year 5, were compared with the outcomes from key stage 2 tests in Year 6.
The finding that the MathsFlip programme had only a small impact (estimated at one month of additional progress - although it was somewhat higher for disadvantaged pupils) has been seized on by those predisposed to dislike the hype behind flipped learning, while those invested in the approach have cried various forms of foul.
Flipped learning - love it or hate it?
What started as a study of the impact of one specific application has been generalised into a judgement on flipped learning in toto. This is unfortunate because attitudes to flipped learning have tended to polarise between boosters and belittlers.
The EEF study in isolation, as robust as it is, yields little information on whether flipped learning works. A study of a maths programme for KS2 pupils says a lot about what to expect if you try to do the same thing with pupils of the same age (and under the same conditions). It didn’t set out to say anything about other ages and stages, other subjects or other forms of flipped learning (which, after all, is a very broad church).
John Blake, of Policy Exchange, was quoted criticising flipped learning on the grounds that it doesn’t work, and that it is likely to reinforce existing educational disparities (arguments which seem to contradict each other). He is an advocate of a knowledge-rich approach to teacher-led, classroom-based instruction. He might be right in his scepticism about flipped learning in general, but the EEF study is not in itself evidence that the approach should be ruled out.
Like many “robust” RCTs, this EEF study focused only on immediate outcomes as measured in standard tests, and was not complemented by any observational analysis of pupils’ attitudes and engagement. Many of the teachers involved testified to changes in attitude and engagement from pupils - outcomes that might easily outlast the end-of-trial tests. This should give even greater grounds for caution in leaping from this study to the dismissal of the entire flipped learning “project” - whatever that might be.
Kevin Stannard is the director of innovation and learning at the Girls’ Day School Trust. He tweets as @KevinStannard1