Guided play: the problems with the research
A team of researchers from Cambridge recently set out to compare the impact of free play, guided play and direct instruction. The conclusion that guided play is “sometimes better” caught my attention because it fed my existing beliefs.
However, I always try to scrutinise such studies closely to avoid research just confirming the ideas I already hold.
For this study, the researchers conducted a systematic review: a thorough way of finding, appraising and synthesising existing studies. This process takes a team many months, and this review examined over 1,500 studies. After screening, 17 were combined statistically using meta-analysis.
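For readers curious what "combined statistically" means in practice: a meta-analysis pools the effect sizes from individual studies, typically weighting each by its precision so that larger, more reliable studies count for more. Here is a minimal, purely illustrative sketch of fixed-effect inverse-variance pooling; the effect sizes and standard errors are invented, not taken from the Cambridge review.

```python
# Illustrative fixed-effect meta-analysis: pool study effect sizes,
# weighting each by the inverse of its variance, so more precise
# studies count for more. All numbers below are hypothetical.
effects = [0.40, 0.15, 0.55, -0.05]  # invented standardised mean differences
ses = [0.20, 0.10, 0.30, 0.15]       # invented standard errors

weights = [1 / se**2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"pooled effect = {pooled:.2f} (SE {pooled_se:.2f})")
```

Note how the second study, with the smallest standard error, dominates the pooled estimate, which is why a handful of weak, imprecise studies can still swing a review's conclusion.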
There were also some specific limitations that I think are worthy of note.
Limitation one
“Garbage in, garbage out” is a classic cry of the armchair critic and is a criticism that will be familiar to anyone involved in reviewing research.
Unfortunately, it is relevant here: in my view, the 17 studies used in the meta-analysis are of poor quality. The review team rightly identified the high risk of bias.
Other issues are that some studies involved researchers or parents delivering the approaches, so the findings may not apply to teachers. Most studies also asked the participants themselves to measure children’s learning, and such self-reported outcomes tend to show bigger effects than more objective measures.
Some researchers embrace this criticism and acknowledge that good research reviews involve effective “waste management”. For example, John Hattie’s influential text Visible Learning is often criticised for its relaxed stance on quality.
In contrast, the Institute of Education Sciences’ stringent standards mean that its reviews routinely include few studies.
The Cambridge team went for a middle option of trying to interpret the best available evidence. Still, the limitations I’ve described, and others, should be considered alongside the team’s conclusions.
Limitation two
Randomised controlled trials usually compare something new to what was happening before the researchers arrived, known as “business-as-usual”. Unfortunately, we often don’t know exactly what business-as-usual involves, and it won’t be the same in every setting.
In this case, the researchers aimed to compare the use of free play, direct instruction and guided play. The reality is that in most of the studies they looked at, business-as-usual probably meant teachers using a combination of all three of these approaches.
Yet, the review grouped business-as-usual with direct instruction, which compromises the main claim that “guided play is sometimes better than direct instruction”. We could restate the claim as “guided play is sometimes better than direct instruction and/or business-as-usual”. Catchy, right?
Making matters worse, direct instruction has multiple meanings. Play is also tricky to pin down, as described in a recent Tes interview with one of the review team.
Limitation three
Another common criticism levelled at research is that you can’t compare “apples and oranges”.
This means that reviewers should only compare similar studies: apples with apples or oranges with oranges, not a mixture of the two.
The challenge for researchers is to combine studies in meaningful ways, guided by their research question. This requires professional judgement, and reasonable people may well disagree with the judgements made.
The Cambridge review undertook 12 separate meta-analyses using just 17 studies. It is unusual to split the analysis like this when so few studies are available. Three separate meta-analyses compared direct instruction and guided play on different maths outcomes. Splitting up maths like this is akin to separating Braeburn apples from Galas.
Across these three maths outcomes, which are the primary basis for the headline claims, just five studies were included. Confusingly, the same “mathematical checklist” outcome from one study was included in two of the three maths meta-analyses. So, in the process of meticulously separating the apples, one was classified as both a Braeburn and a Gala.
A single meta-analysis for the maths outcomes would have probably been more appropriate. I’d even argue for combining the maths and literacy outcomes into a measure of attainment given the small number of studies found.
More cynically, the fragmented analysis makes me worry about analytic flexibility: researchers rummaging through the data until they reach a favourable conclusion. This can happen unintentionally, which seems plausible in this case.
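The worry about analytic flexibility has a simple statistical basis: the more separate comparisons run on the same data, the greater the chance that at least one looks “significant” by luck alone. A toy simulation, with entirely invented numbers and no connection to the review’s actual data, makes the point.

```python
import random

random.seed(1)

# Toy illustration of multiple comparisons: run several analyses on
# pure noise and count how often at least one crosses p < .05 by
# chance. With one test the false-positive rate is 5%; with 12
# independent tests it rises to roughly 1 - 0.95**12, about 46%.
def any_false_positive(n_tests):
    """Simulate n_tests null results; each is 'significant' with probability 0.05."""
    return any(random.random() < 0.05 for _ in range(n_tests))

trials = 10_000
hits = sum(any_false_positive(12) for _ in range(trials))
print(f"chance of >=1 spurious finding across 12 analyses: {hits / trials:.0%}")
```

This is only a sketch of the general statistical hazard, not a claim that the review team did anything deliberate; pre-registration, which the team used, is precisely the safeguard against it.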
Research on guided play: what can we take from it?
So, what does all of this mean for what we can take from the study?
Whatever your current views on play, I believe you probably shouldn’t update them much based on the review, except perhaps to reduce your overall confidence. That does not mean your current ideas are right or wrong, just that the available evidence is extremely limited.
It also does not mean the review is useless. The review has many strengths including a pre-registered protocol setting out how the researchers planned to do the work, and they have shared detailed supplementary information, which makes it possible for readers to scrutinise claims closely.
It is also very useful to know what we do not know. Nonetheless, scrutinising claims closely is critical, particularly when they feed our existing prejudices.
Thomas Martell is a secondary biology teacher and research school director based in Durham