Why ‘The evidence says...’ can be a dangerous statement
“There are three legs on the stool of education,” a tutor informed me on the first day of my PGCE: “research, policy and practice. Researchers are out of touch with practice, practitioners have no time for research and policymakers know nothing about either.”
These days, the phrase, “The evidence says…” is heard increasingly often in schools. Does that mean the teaching profession is engaging more with all that research out there? Is our practice finally becoming more evidence-based? I fear not.
A useful test of this is to reply to the nugget of supposedly evidence-informed wisdom being offered with, “What evidence is that?” Sadly, the citations are often less than forthcoming.
Prejudice, dressed up as evidence
The impulse is a good one, and to be encouraged. It is important that we move away from doing things because that’s how we’ve always done them, and find out what actually helps children to learn.
The danger is that we now find ourselves drifting into a worst-of-all-worlds scenario, in which ideas spread relatively unchallenged from school to school because they come dressed up as evidence-based, but are in fact as rooted in prejudice and anecdote as ever they were.
Teachers trotting out a notion half-remembered from a course they once attended, or gleaned from the headline of a magazine article, is not the critical, active engagement with research literature that we so sorely need.
The question “What evidence is that?” may seem a little confrontational, but it is an important one. Often, these supposedly evidence-based claims do have their origins in genuine research. The problem is that, without knowing what the evidence actually tells us, well-meaning teachers risk doing the children we teach a disservice, albeit in good faith.
The great threat to evidence-based teaching is no longer indifference to research - it is myth dressed up in the livery of evidence.
If it sounds overly simple…it is
We’ve all heard them. “Apparently, the evidence says that mixed-ability teaching is better than grouping by ability.” Sound overly simple? It is. The picture is far more complex and interesting than that.
Ability grouping in its various forms was in vogue in British schools for so long that there is a wealth of evidence on the subject. Ability grouping within primary mathematics has been shown to have a moderate positive effect size. There is reasonable evidence that setting for mathematics and reading can have an overall positive effect when combined with an accelerated curriculum for higher-achieving groups.
Conversely, tracking children into fixed sets for a subject, and then teaching them all the same lessons, has been demonstrated to have a negative effect, especially for those in lower-achieving groups. The fixed streaming of children across subjects has been shown to have a negligible impact on everyone involved.
All these forms of ability grouping can be used well and can be used dreadfully. So is mixed-ability teaching better? It’s just not that simple.
Teaching assistants: good or bad?
“Apparently, the evidence says teaching assistants are bad for children’s progress.” “What evidence?” I hear you ask. Good question.
Certainly, meta-analyses have shown effect sizes close to, and sometimes even below, zero when correlating pupil progress with support from teaching assistants.
We need to be cautious with meta-analyses, however - treating them for what they are and not reading more into them than the evidence actually says.
The data may show that children who work with TAs make relatively little progress. But remember the rallying cry of the statisticians: correlation is not causation. Which children in schools do TAs generally get directed to work with? Disproportionately, those who are already making less progress.
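For the statistically inclined, a minimal simulation makes the point. This is a hypothetical sketch - the deployment rule and the +0.3 “true effect” are invented for illustration, not taken from any study cited here - in which TA support genuinely helps, yet is directed disproportionately at children who are already behind:

```python
import random

random.seed(1)

# Hypothetical sketch, not real data: 1,000 pupils with varying underlying
# rates of progress, in arbitrary standardised units.
pupils = [random.gauss(0.0, 1.0) for _ in range(1000)]

# Assumed deployment rule: TAs are directed mainly at pupils who are behind,
# plus a small random scattering of others.
has_ta = [p < -0.5 or random.random() < 0.1 for p in pupils]

# Assume, for the sake of argument, that TA support genuinely helps,
# adding +0.3 to each supported pupil's progress.
observed = [p + (0.3 if ta else 0.0) for p, ta in zip(pupils, has_ta)]

supported = [o for o, ta in zip(observed, has_ta) if ta]
unsupported = [o for o, ta in zip(observed, has_ta) if not ta]

print(f"mean progress with TA support:    {sum(supported) / len(supported):+.2f}")
print(f"mean progress without TA support: {sum(unsupported) / len(unsupported):+.2f}")
```

Run it and the supported group still averages well below the unsupported group, despite the genuine benefit we built in - precisely the trap that a raw correlation sets.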
Then we have the longitudinal Deployment and Impact of Support Staff (DISS) project, which did control for such variables and showed some pretty damning results. It did not, however, look at the impact of specific TA-led interventions or forms of support. Nor did it look at the level of training or experience of the TAs involved.
The Education Endowment Foundation, on the other hand, has described moderately positive effects where TAs deliver one-to-one or small-group support that they have been specifically trained to provide, under teacher guidance. There is also evidence that some forms of TA deployment can have a positive impact on teacher wellbeing and on pupils’ attitudes to learning.
So how effective is the use of TAs? It depends.
Critical consumers of research literature
This list could go on and on: class sizes, thematic curriculums, growth mindsets…and did somebody say learning styles?
Sadly, the issues surrounding the three legs of the stool have not gone away. Teacher training generally places little emphasis on developing teachers as critical consumers of research literature, and there is far more work to be done in giving the profession easy access to that research.
But we need to be the driving force behind the cultural change that education requires. The prize is greater autonomy for the profession, enlightened decision-making and better outcomes for our pupils.
In some ways, we have taken the first step: people are talking about evidence. Now, if we are going to stop pedagogical memes from propagating unexamined through our staffrooms, we need to start asking each other where that evidence is.
Thomas Kent is a middle leader at a primary school in Southend-on-Sea