Ofqual has set out its approach to regulating the use of artificial intelligence (AI) in assessment and qualifications.
The watchdog’s plan aims to ensure fairness in assessments, maintain the validity and security of qualifications, and enable innovation.
It comes after an exam board director warned earlier this year that the widening of access to AI tools presents a “significant burden” for teachers when marking assessments, such as coursework.
Ofqual said it has taken “steps to support safe delivery of current assessments”, and been clear about when AI does not comply with the rules that awarding organisations must follow.
Here is what the regulator is doing to manage AI use.
Can AI be used for exam marking?
Ofqual told awarding organisations in September 2023 that using AI as a sole marker does not comply with regulations.
It reached this view because using AI does not meet the requirements for a human-based judgement, and because the potential for “bias, inaccuracies and a lack of transparency” in how marks are awarded could make the system unfair.
However, Ofqual did say there are opportunities for AI to complement human marking, though further evidence will be needed before this can happen.
The exam board AQA has said it will be trialling how AI can be used to provide quality assurance for human marking in this summer’s exams.
It said GCSE and A-level data will be used to check to what extent marks given by AI match those of senior markers.
Ofqual said today that using AI as a sole form of remote invigilation is unlikely to comply with regulations, but it would be kept under review.
How will AI affect coursework?
The think tank EDSK previously warned that the rise of ChatGPT and other AI programs could make coursework and other forms of teacher assessment impossible, owing to the risk of cheating.
Ofqual admitted that non-exam assessment is “potentially placed under more pressure” from AI.
However, it said that only “modest numbers of cases” of malpractice in coursework have been identified by exam boards in initial reports.
The Joint Council for Qualifications has also issued guidance setting out best practice for teachers and assessors in further detail.
How will AI malpractice be managed?
Ofqual said it has requested information from all awarding organisations about how they are managing malpractice risks from AI.
It said it plans to evaluate this evidence and follow up where necessary.
Evidence from exam boards will also be used to inform further guidance on AI and how to mitigate harm from its inappropriate use.
Specific AI categories will be introduced for exam boards to use when reporting malpractice to Ofqual.
What is the future of AI in assessment?
Ofqual said it plans to “provide additional sector guidance and advice, which will further secure safe and well-considered innovative use, though if we deem it appropriate, we may add, remove or amend our rules”.
Awarding organisations can use Ofqual’s innovation tool to explore how new ideas work with regulatory requirements, including those that use AI, it said.
The regulator said it will continue its research into the opportunities for AI marking, as well as research into perceptions among students, teachers and parents of using AI in the design and delivery of qualifications.