Edtech in schools: 6 steps to success
During the pandemic, schools rushed to embrace edtech like never before to help maintain as much teaching and learning provision as possible.
For many schools, this meant building on existing strategies, but others were flying blind as they entered a world they had rarely dabbled in, and had to learn as they went along.
Even before the pandemic, though, there was a recognition that edtech had a crucial role to play in education and that schools needed more guidance to get the most from its potential. This led the Department for Education to produce a vision for edtech in 2019 that sought to give the sector the guidance it needed to use technology to its advantage and spend wisely in this area.
As part of this, a major report was commissioned by the DfE to assess numerous rollouts of technology in order to draw some key insights into how to manage a successful and impactful edtech rollout. This report has now been published.
At 114 pages it’s not a quick read, but the conclusions gleaned from widespread surveys, interviews and case studies provide a useful step-by-step guide for anyone tasked with overseeing edtech in a school or multi-academy trust to ensure that they are following best practice guidance from government.
How to introduce edtech to your school
1. Identify your needs
The opening point of the report is that schools should ensure a clear need is established for any edtech purchase - ideally by having a clear digital strategy that sets out how edtech complements the school's wider agenda.
The report suggests doing this by using “formal reviews, staff consultation or using edtech leads or wider staff to feed into the identification of need” so that there is a clear oversight on any new system being considered.
The report cites an example from an edtech lead at one school, who explains that where decisions were once made simply on the basis of what an individual product could do, they are now led by the school's focus on boosting its pedagogy, ensuring that any new technology is implemented for the betterment of provision.
“We pivoted from products to principles,” they say.
Another example of this approach comes from Cramlington Learning Village, a secondary school in Northumberland, which is cited as using its edtech vision - which is all about impacting “positively on learners’ daily learning experience” - when considering new products.
This meant that when the school came to look for a new edtech tool to offer a more “interactive learning environment” that would also “provide teachers with greater feedback from learning activities to support assessment”, it had a clear guiding star for any platform it considered.
Finally, to ensure that there is a mechanism for the consideration of new tools, the report says schools and MATs should ensure there are routes for potential new products to be put forward, such as through regular formal reviews of requirements or staff consultation.
2. Make informed decisions
Following on from this is the importance of conducting research to identify as many different tools as possible to ensure that you have the best chance of finding the right tool for the job.
This can be done by seeking recommendations from education professionals, technology providers and other schools, and also via social media to see what others are using.
Cramlington is again cited as a good case study for this: the school not only invited numerous companies in to discuss their products when it was considering a new tool but it also considered whether building an in-house option was viable and more suitable.
From this, the senior leadership team drew up a SWOT analysis of the proposals to weigh them up against each other - and ensure they aligned with the aims identified. This helped to ensure that the provider they would move forward with was the best fit for them.
Conversely, not engaging in this sort of work can lead schools down a dead end, with the edtech lead at one small MAT admitting that the purchase of 3D printers led to numerous issues they had not been aware of - something that might have been avoided by asking others for anecdotal insights.
3. Run a pilot - and run it well
This issue also underlines why it is so important that schools carry out pilot projects with any potential technology purchase, so that they can determine whether a product or device will work as intended and uncover any problems before a big investment is made.
“Piloting with a range of users and ‘test’ and ‘control’ group approaches helped to provide rich data for evaluating suitability,” the report notes.
It says, too, that this should not be rushed because it needs proper time and focus: “Dedicating ring-fenced time to exploring and piloting new technology ensured the process was prioritised.”
This may sound straightforward but two example case studies underline that this needs careful consideration.
For example, a primary school cited in the document explains that it rolled out a pilot of e-readers at key stage 2 to boost reading. While a positive impact on learning was seen, the subsequent school-wide rollout was a failure because a litany of issues came to light that had not been considered at the pilot stage:
- How and where the devices would be charged.
- Creating email addresses for every child in the school.
- The simplification of the logging on process and password management for a key stage 1 context.
- How devices could be linked so that books could be shared across the entire year group.
- How learners could be prevented from accessing and downloading material independently without the teacher’s knowledge.
This meant the project had to be abandoned, and the school attempted to recoup its losses by selling the devices to parents.
A second example came from a MAT that piloted new timetabling software across seven schools, with the assumption that it would prove successful.
“The software chosen was one of the market leaders, with excellent marketing, and was the favourite option amongst those responsible for timetabling,” the report notes.
However, it did not meet the schools' actual needs - requirements that had not been fully scoped out beforehand to inform the decision.
“In hindsight, our reasons for piloting that particular product were not sound enough. We went with it because it was shiny, it was one of the market leaders,” the edtech lead lamented.
A positive example of how to run a successful pilot, though, comes from Denbigh High School in Luton, which wanted to create a one-to-one device programme with its pupils.
To test this, it used one class within a year group to act as the pilot group with devices and two others without devices to act as the control group.
This enabled the school to compare the learning outcomes between the two groups and build evidence of the positive impact, and to do so at a scale that gave it the confidence that a full rollout could be managed successfully.
4. The implementation stage
If after piloting the rollout goes ahead at scale, there is still much to consider.
The report notes, for example, that schools need to develop an action plan and set expectations for use to ensure that any new devices or software are used as intended, rather than being left to gather dust.
Schools and MATs should ensure that any new technology rollout is clearly communicated with a clear rationale and the benefits are made explicit, and that staff at all levels are engaged. The importance of involving parents whenever required was also touched on, as they may well have to learn how to use any new platforms.
A good example of why this matters so much is given with a case study of a college that moved to digital assessments for its learners.
While the majority of students adapted without any issues, for some learners with special educational needs and disabilities (SEND) and foundation studies learners it proved problematic.
“Some had difficulties logging in due to coordination difficulties, or they were not able to transcribe a six-digit number for multifactor authentication from their mobile phones on to the computer log-in page,” the report notes.
The college responded by training parents, carers and support workers to be involved in the process for the learners, which overcame this.
An unintended consequence of this, though, was that it had an “impact on the learners’ independence”, something that perhaps should have been thought about during the pre-implementation stage or during a proper pilot.
5. Training and support
As the above example shows, rolling out new technology cannot happen without training - something the report says can be achieved in numerous ways.
For example, Inset days are cited as a popular time to train staff on new platforms - and this also helps to establish the importance of any new platform being rolled out in terms of the school’s priorities.
The report notes, though, that maintaining support for staff after this is vital to ensure that early enthusiasm and upskilling do not ebb away and that further training is offered if required to get the most from any new tool.
Further to this, establishing different levels of support or training on edtech tools can help to meet the different capabilities of teachers, from those who are less confident to “power users”.
This can also identify what the report terms “learner champions”, who can use their interest in this area to take on extra responsibilities, upskilling peers or pupils who need additional support.
Offering the chance to learn on accredited edtech courses was also cited as a good way to upskill staff and identify those with a keen interest in this area to help in future edtech projects.
Any examples of best practice with tools already in use across the school or MAT should be shared where possible, to show others how they could make the most of the technology.
6. Monitoring use and effectiveness
Finally, the report highlights the need for schools or MATs to ensure that they take a rigorous approach to monitoring the use and impact of any edtech tool.
This can not only help them to understand whether it is having a positive impact on learning outcomes or improving processes but can also identify whether some users or schools are not engaging with the new tools as much as hoped.
For example, the report cites a single-academy trust that rolled out new teaching and learning software to give teachers another way to engage with their classes.
Within this, the edtech lead was able to view data on who was using the software and how often it was being used. In doing so, they were able to identify teachers using it the most and appoint them as champions for that tool so they could help teachers less confident in its use.
More standard ways of monitoring use, such as learning walks and lesson coaching and feedback, can also help to identify someone whose expertise should be promoted - or those who may need more support to get the most from any edtech platform.
User feedback - from teachers, parents and pupils - should not be overlooked either, as it can reveal whether further training is needed or identify problems in use. Schools should also consider establishing mechanisms for that feedback to be heard regularly.
Dan Worth is senior editor at Tes