Beware: research facts aren’t always what they seem
Major research flaws often go unnoticed and are passed along until they become received wisdom, says Christian Bokhove, who points to the academic urban legend surrounding spinach
If, like me, you like to know the evidence for any claim made about education, research articles can be incredibly frustrating. Quite often, there is only a vague semblance of a breadcrumb trail back to the source - a link to a book on Amazon, perhaps, which is pretty much useless. Sometimes there is no trail at all. You can find yourself scrabbling around in the dirt trying to find out whether what someone is saying is actually true.
It’s worthwhile work, though. And there’s a very good example of why, from the scholar Ole Rekdal. For years, it was thought that spinach contained a lot of iron. Countless studies referenced the claim - it was established knowledge. That is, until a group of intrepid explorers followed the trail of evidence and discovered that it was wrong: a shift in the decimal place of a number, it was claimed, had inflated the iron figure tenfold and given it a false certainty.
So it was a myth? Yes, but not for that reason. Much later, a new group of evidence hunters followed the trail of the “shift in the decimal place” claim and found that its actual origin lay in a Reader’s Digest article that had no foundation in science whatsoever.
How did this happen? Over the years, several people - academics and laypeople alike - have “lazily, sloppily or fraudulently employed sources, and peer reviewers and editors have not discovered these weaknesses in the manuscripts during evaluation”, according to Rekdal.
Another recent example of received wisdom can be found in ED Hirsch’s book, Why Knowledge Matters, as I discovered when I tried to replicate one of the diagrams about France in chapter seven. In the chapter, Hirsch argues that changes in the curriculum in France - a move from a “knowledge curriculum” to a “skills curriculum” - not only decreased performance but also increased inequality in the country. Despite the challenges of defining what a “knowledge curriculum” and a “skills curriculum” are, and the always tricky matter of separating correlation from causation, I tracked down the raw data.
Using my basic grasp of the French language, I was able to find the data and re-analyse it. Some choices in the graphs were unclear, so I had to guess, but I discovered that the graphs were missing some crucial information, which means that the conclusions Hirsch reaches may be slightly inaccurate. Despite this, his analysis of France is quoted far and wide in education. And, it seems, no one has bothered to check the sources (although, in his most recent book, he has corrected some of these issues).
Now, I’m not suggesting that teachers have the time to chase up all of these sources. Even with a trained academic eye, it can take hours to properly analyse a research claim. But we do need to recognise that not everything we read is as it seems. We need to be cautious. We need to be critical. Otherwise, we will all be eating the education equivalent of spinach without any hope of getting the iron we were promised.
Christian Bokhove is associate professor in mathematics education at the University of Southampton and a specialist in research methodologies
This article originally appeared in the 2 October 2020 issue under the headline “Cast-iron facts aren’t always what they seem”