How to ensure effective CPD in schools
Most teachers want to be better teachers; they want to improve. But most teachers are also rightly sceptical about some of the ways in which they’re asked to improve.
Last year, a Teacher Tapp poll found that free food for teachers was as popular a proposal as three days’ professional development (PD) of their choice, while a visualiser or an extra free period every week were substantially more popular.
There’s no question that teacher PD gets a bad rap. But is that fair?
We have good evidence that, on average, PD helps teachers to improve. But the effects are variable, and our previous assumptions about which types lead to improvement and which don’t have been found wanting.
In 2018, we conducted research that found a consensus had developed among researchers - a consensus that informs the standards for teacher PD, as set out by the government. The assumption underpinning this consensus is that PD for teachers works better if it’s sustained and collaborative, subject-specific and practice-based, and is supported by external expertise and teachers’ buy-in.
Yet in our 2018 study, we discussed programmes that did all these things but didn’t appear to work. And we found programmes that didn’t do everything on that list but did work. This was a problem.
Two issues seemed to be driving this discrepancy. First, the studies underpinning the consensus were small and often used weak research designs. As a profession, we were more confident about the power of this consensus than we should have been.
Second, reviewers had looked for the common characteristics of PD programmes with positive effects. Unfortunately, it does not follow that these elements are therefore key to good PD. Effective programmes might share a characteristic - such as drawing on external expertise - by chance.
So our conclusion was that, although we thought we all knew what made great PD, we didn’t really. Which is why we have the problem we started with and why teachers’ criticism of the development opportunities available to them is probably fair: it’s not just demonstrably ineffective PD that drives poor engagement from teachers, it’s the stuff we think works, too.
So, what should we be doing instead? Not wishing to point out a problem and run, with funding from the Education Endowment Foundation, we formed a team - together with Alison O’Mara-Eves and Sarah Cottingham - to conduct a new, systematic review of the evidence. This review has informed a new EEF guidance report, Effective Professional Development, published today.
First, we set out to find every relevant study about teacher PD that we could. We had two criteria: we would include only randomised controlled trials (the best way to see the effect of a PD programme is to select some teachers or schools to experience it at random while an equivalent group do not experience it); and we limited ourselves to studies that measured student learning, which we see as the ultimate goal of teacher PD. That left us with 104 relevant studies, focusing on teachers of a range of pupil ages and subjects.
The programmes that were evaluated by these studies also offered teachers varying combinations of tasks and training, coaching and resources. Basically, we had a strong, varied evidence base to work from.
Transferring this evidence into practical, actionable advice, though, was tricky. In the end, it required us to develop a novel theoretical model based around two building blocks: purposes and mechanisms.
Building block one: purposes
To get fit, I need a reason to act (feeling unfit), a goal (to be able to run 5k), a training plan (short- and medium-term goals, stretches, less wine), and support to follow it (cheering masses along the tracks of our local park).
The same elements are needed for any change effort: we must first understand the value of making that change and be motivated to make it, know what we need to do to effect it, and then stick with it.
Based on this, and on forthcoming work from Josh Goodrich at Steplab, we predicted that a PD programme was most likely to help teachers to change if it had a balanced design that addressed the following four purposes:
- Developing teachers’ knowledge around teaching and learning.
- Motivating goal-directed behaviour: encouraging teachers to act on this knowledge.
- Teaching techniques to put this evidence-based knowledge to use.
- Embedding these changes in teachers’ practice.
For example, a PD programme might teach teachers about dual coding (knowledge), convince them that dual coding is worth pursuing (motivate), teach them not to speak over written text (technique) and support them to keep doing this (embedding in practice).
We predicted that a programme was less likely to work if it failed to address all four of these purposes. For example, new knowledge without an associated technique would lead to a gap between insight and action. Likewise, new knowledge and a new technique, without support to embed this in practice, would likely lead to temporary change, followed by a reversion to old habits.
Building block two: mechanisms
Once we had set out these purposes, we considered how a successful PD programme would address them: what mechanisms would it use to achieve each purpose?
By “mechanism”, we mean something that causes a change. For instance, to return to my fitness metaphor, one mechanism to help me get fit would be setting a reminder to go for a run: this would help me to embed my training plan in practice.
Often, in education research, we talk about mechanisms in vague terms, using words like “collaboration”, “support” and “challenge”. Here, we needed precise and unambiguous definitions that would allow us to identify mechanisms in studies accurately - and we wanted leaders and training providers to be able to do the same.
We therefore began with a list of mechanisms that help people to change, developed by psychologist Susan Michie and colleagues (2013), and made some small adaptations to suit the sector we were looking at. For example, we removed mechanisms that weren’t relevant (we’ve not yet seen an experiment asking teachers to wear heart-rate monitors to help them improve) and added some that were missing, such as revisiting past learning. This gave us a final list of 14 mechanisms.
Evaluating the studies
Armed with these two blocks, we began to identify their features in the 104 PD studies we had identified. For example, a careful reading of the My Teaching Partner coaching programme showed that it addressed all four purposes through eight mechanisms. These included:
- Building knowledge (purpose) through coaches revisiting material (mechanism), returning to different elements of the course material across the two-year programme.
- Motivating teachers’ goals to change behaviour (purpose) by agreeing goals of ways to improve student-teacher relationships during their meetings (mechanism).
- Teaching techniques (purpose) by explicit modelling (mechanism): teachers were shown videos of effective teaching during a training session and had ongoing access to a library of videos highlighting particularly effective practices.
- Helping teachers embed practice (purpose) by promoting self-monitoring (mechanism), asking teachers to observe specific parts of a video of their teaching, and consider the link between their actions and their students’ reactions.
Applying this process to every study meant we ended up with a database that included the mechanisms, the purposes and the impact of each. We could then use this database to test our predictions.
So, what did we find out? As we had predicted, programmes using more mechanisms were more likely to improve student learning. On average, a programme with no mechanisms would have no impact; a programme with 13 mechanisms would have an impact equivalent to two months’ additional pupil progress.
Next, we looked at whether programmes with a balanced design - incorporating mechanisms addressing all four purposes: knowledge, motivation, techniques and practice - were more effective. We found that balanced designs had a three-times higher impact, on average. While there is still some statistical uncertainty about this point, based on our research, we believe the best bet is to adopt a balanced design.
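To make the shape of this analysis concrete, here is a minimal, purely illustrative sketch of the kind of database described above. The study names, mechanism labels and impact figures below are invented for illustration (the real review used 104 randomised controlled trials and formal meta-analysis); only the idea is from the article: code each study's mechanisms and the purposes they address, then compare the impact of balanced and unbalanced designs.

```python
# Illustrative sketch: each study is coded with the mechanisms it used,
# the purposes those mechanisms address, and its measured impact
# (in months of additional pupil progress). All values are invented.
PURPOSES = {"knowledge", "motivation", "techniques", "practice"}

studies = [
    {"name": "Study A",
     "mechanisms": ["revisiting material", "goal setting",
                    "modelling", "self-monitoring"],
     "purposes": {"knowledge", "motivation", "techniques", "practice"},
     "impact_months": 2.0},
    {"name": "Study B",
     "mechanisms": ["credible source"],
     "purposes": {"knowledge"},
     "impact_months": 0.0},
    {"name": "Study C",
     "mechanisms": ["goal setting", "feedback"],
     "purposes": {"motivation", "techniques"},
     "impact_months": 0.5},
]

def is_balanced(study):
    """A 'balanced' design includes mechanisms covering all four purposes."""
    return study["purposes"] == PURPOSES

def mean_impact(group):
    """Average impact, in months of additional pupil progress."""
    return sum(s["impact_months"] for s in group) / len(group)

balanced = [s for s in studies if is_balanced(s)]
unbalanced = [s for s in studies if not is_balanced(s)]

print("Balanced designs:", mean_impact(balanced))
print("Unbalanced designs:", mean_impact(unbalanced))
```

On toy data like this, the comparison is trivial; in the review itself, the same grouping was done across all 104 studies with appropriate statistical controls.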
Finally, we set out to determine whether there was one form of PD that would be more effective than others. We used our mechanisms and the existing literature to define three forms of PD that teachers would be likely to encounter:
- Instructional coaching: including, as a minimum, the mechanisms of goal setting, feedback, instruction or modelling, and rehearsal or practice.
- Lesson study: including action planning, practical social support and feedback.
- Teacher-learning communities: including practical social support, action planning and goal setting.
Comparing programmes of each of these three forms, we found that the average impact was similar: equivalent to around one month of additional pupil progress. None of the three forms was clearly more effective than the others. But we also found that - in each form - a programme was more likely to have an impact if it included more additional mechanisms. That is, to be classed as instructional coaching, a programme had to include goal setting, feedback, instruction or modelling, and rehearsal or practice. On average, the more mechanisms a programme had on top of these, the more effective it was.
Was it possible to have too many mechanisms? Not according to our data. Given that we only had 14, though, we’re probably not yet at the ceiling for the number of valid mechanisms.
You might be thinking that this was an interesting intellectual exercise but what are the practical takeaways? We think there are four key points you should consider as a creator, commissioner or consumer of PD.
1. PD works, but not always
The programmes in our sample had an average effect equivalent to approximately one month of additional progress. However, this average hides wide variation in quality - much PD probably has zero impact.
Therefore, we cannot say that any PD is automatically a good thing. Teachers and leaders need to be critical as they commission and design it, if they are to ensure that it causes improvement, and justifies the financial and time cost.
2. Stop arguing about forms
There has long been debate around the merits of, for example, lesson study as opposed to instructional coaching. Our evidence suggests that this isn’t the right distinction to be arguing over; on average, the two approaches have roughly the same impact.
While the terms remain useful, as they capture and communicate a distinctive approach, pitting one against another is unlikely to shed much light on how best to improve teacher learning.
The form is not important. What matters is how a programme is designed: the purposes it addresses and the mechanisms it includes.
3. Think hard about the mechanisms included in PD
The more mechanisms a programme includes, the greater its impact tends to be. When designing or commissioning a programme, leaders and providers should ask: “How many mechanisms are included?”
If the answer is “not very many”, this list provides a starting point for improvement. For example, could you:
- Call on a credible source to introduce new ideas?
- Invite teachers to make detailed plans for action, including “how” and “when”?
- Encourage teachers to keep practising new techniques in their lessons?
Remember that we only included mechanisms on our list if we found evidence that they worked. There were plenty we thought about including but couldn’t find evidence for. Just calling whatever we like a mechanism - collaboration, support, challenge - won’t improve PD.
4. Create balanced programmes that address all four purposes
Programmes with more mechanisms seem to work better but, in designing PD, more isn’t necessarily better in every sense. If we are trying to address too many different purposes, we may end up with a programme that is complicated, expensive and time consuming.
Instead, consider what you are really trying to achieve. Programmes that address all four purposes of PD are more likely to lead to increased student learning. So, in designing or choosing PD, ask whether you have incorporated mechanisms that:
- Promote evidence-informed insights (knowledge) for teachers.
- Motivate teachers to pursue specific goals.
- Teach teachers techniques to achieve those goals.
- Help teachers embed those techniques in their practice.
So, where does all this leave us? To bring it back to the point we made at the start of this piece: most teachers want to get better at what they do. And the good news is that this latest research adds to our evidence base for PD being an effective way to help them to do that.
At the same time, we can’t continue to assume that just any professional development is useful. This latest study still doesn’t give us all the answers but we hope that it offers the building blocks to help schools to improve their provision further - and come one step closer to finally shaking off those negative perceptions that have dogged teacher PD for far too long.
Harry Fletcher-Wood is associate dean at Ambition Institute and Sam Sims is a lecturer at the Centre for Education Policy and Equalising Opportunities at University College London
This article originally appeared in the 8 October 2021 issue under the headline “How to be a better teacher”