Teachers need guidance to help them identify and manage pupils’ use of artificial intelligence (AI) and respond to cheating, according to a Department for Education report published today.
The report on generative AI in education also calls for funding to be put into researching the development of AI content detectors to support educators as the technology becomes more sophisticated.
The recommendations are contained in a report by the DfE’s Open Innovation Team, based on interviews with teachers, academics and experts in the edtech industry.
The latest DfE findings were announced by education secretary Gillian Keegan, speaking at the Bett Show today. She said that the government’s top priority in exploring emerging technologies “is the safety of young people”.
The report highlights a Teacher Tapp poll from November last year showing that 42 per cent of primary and secondary teachers had used generative AI in their role, up from 17 per cent in April 2023.
It warns that there is “uncertainty, concern and scepticism” about the use of generative AI in education, adding that the technology “introduces many new risks that need to be managed”.
The report calls for guidance to help teachers “identify and manage student use of AI and respond to academic malpractice”.
The government could support this, it says, by “convening current understanding of best practice for managing malpractice to mitigate the risk of students being unfairly penalised based on limited evidence.”
The DfE report also warns that use of generative AI will become harder for teachers to identify as the technology becomes more sophisticated.
“Funding for research is needed to support the development of tools that can reliably detect AI-generated outputs, and for other initiatives that could help”, it says.
Generative AI ‘could worsen digital divide’
The report also warns that generative AI could exacerbate the “digital divide” in education.
“There is already an emerging difference in adoption of generative AI between state and independent schools,” it says.
Exam boards have previously called on the government to make sure that schools have the digital access and guidance they need for online GCSE exams.
Today’s report recommends that the government adopt a long-term strategy for the use of generative AI in education, and suggests an AI literacy initiative for young children “to help them understand their digital rights”.
In her speech today Ms Keegan said: “We should have the same expectations for robust evidence in edtech as we do in other areas of education.”
She called on edtech businesses to lead the way, be transparent with buyers, and promote products based on “great evidence of what works”.
The DfE is currently exploring how generative AI can reduce teacher workload, including providing Oak National Academy with “up to £2 million” of funding to invest in building AI tools.
However, this funding announcement was met with some criticism from the sector, with unions warning that AI is not a “silver bullet” to fix teacher workload.
The report also follows government guidance setting out new technology standards for schools, which includes assigning a senior leadership team (SLT) member to be responsible for digital technology.
Today’s report draws on the results of the DfE’s call for evidence on AI in education, held last year. The results of a two-day “hackathon” into AI use by schools, held in October last year, are due to be released in the spring.