Inside the trial that could make digital exams a reality

Last year, as students in England sat their traditional English GCSE, those in international schools were sitting the same exam as a digital, computer-based test in a pilot that offers an intriguing insight into the future of digital assessment.
24th March 2023, 5:00am

“Pens down, stop writing. Your time is up.”

These are the words that ring out in every school hall during every exam season - but could this soon be a thing of the past?

Last summer, while students in England filled in their paper scripts the traditional way, students in Bahrain, the United Arab Emirates, Qatar and Spain sitting the English language iGCSE took their test entirely online using a computer, as part of a pilot run by exam company Pearson Edexcel.

“Digital assessments feel like an exciting next step,” says Hayley White, head of assessment at Pearson and the person who oversaw this pilot, talking to Tes about how it all went.

These tests were the real deal for these students - there were no back-ups, nor any “it’s just mocks” fallback if something went wrong.

As such, the trial represented a big step towards digital exams being rolled out more widely - and eventually becoming the norm in schools in the UK.

But did it go well enough to make that objective a reality any time soon? We’ve been given an exclusive insight into the trial to see what went well and where there were challenges. 

How did the digital exam pilot work?

Rather than having the traditional set-up of a printed booklet of extracts to read and an answer booklet to write their responses in, students were sat at a laptop or desktop computer, with a mouse and a keyboard and access to the bespoke software built by Pearson for the tests.

This software (as yet unnamed, but referred to as “test player” by Pearson) mimicked the usual exam booklet structure, but with a split screen that displayed a “booklet” of texts on the left and the “answer booklet” on the right, into which candidates typed their answers.

Candidates could also highlight text, copy and paste sections, use “digital Post-it notes” to record their thoughts and even change the colour and style of the font and the background colour.

For White, the key was to give students all the same capabilities they would have in a paper exam, but in a more accessible and less cumbersome way.

“The technology is there to mimic what modifications students are already allowed in exams - but it can do it more discreetly,” she adds.

Furthermore, to eliminate the risk of online cheating, the device’s access to the internet is blocked the moment the Pearson exam software is loaded. Schools that took part also had to change their invigilation practices, providing one invigilator for every 20 students rather than the standard one for every 30.

“The extra invigilators are what the Joint Council for Qualifications stipulate for on-screen assessments,” explains White. “They are necessary in order to ensure thorough supervision.”

Finally, the schools involved were chosen, White says, because they had already shown a willingness to be involved in earlier trials of digital exams - so Pearson knew they would be committed to making it work while offering a “critical perspective” of the experience and identifying the “practical challenges that needed working through” for future digital exam rollouts.

What did schools make of the digital assessment?

Nick Cairns, assistant head and exams officer, King’s College, Madrid:

All our students work on a laptop rather than exercise books, so moving to computers for exams felt like a natural and logical step, and it was no effort to sell the idea to students.

In many ways, the two methods of assessment are very similar: both run for two hours and 15 minutes; they are undertaken in silent exam conditions; students are positioned to prevent “looking over the shoulder” cheating.

But the capabilities allowed on the computer assessment really appealed to our students. For example, changing the size of the font was really useful for students with additional educational needs who required larger fonts.

What’s more, when the work is sent to the examiner, it all reverts to standard typed text. On a paper exam, by contrast, students can’t hide their mistakes: every crossing out or reordered paragraph is obvious.

Students also told me they liked being able to see the text side by side, rather than flicking back and forth in a booklet. We had a few students who came out of the exam and said they wished all their exams could be taken on a computer.

Wayne Ridgway, headteacher, British School of Bahrain: 

All pupils from Year 5 upwards were already using computers for all our internal assessments. However, initially some staff were sceptical. Mostly, they wanted reassurance that students would have the final choice of doing the exam on paper or on a computer, and, of course, we offered this choice. However, all 162 of our iGCSE English students chose the computer exam.

We then set aside lesson time for students to get used to the interface and how it all worked, and for staff to explain the logistics.

It wasn’t just staff and students we had to communicate changes to, either - we had to think about parents, too, and we held several meetings to answer questions and keep them informed.

For the exam itself, children used laptops in the sports hall, with power leads connecting each device to ensure there were no flat-battery disasters. Students also sat spaced out at exam desks to ensure no screen was overlooked.

Starting the exam took a lot longer than usual because we staggered everyone logging in row by row to avoid overwhelming the network.

Tech support from Pearson was on hand to resolve any issues, and any time lost to technical problems was added on at the end for the students affected, so no one was disadvantaged.

For example, we had one student who forgot to submit their paper at the end - but the tech team retrieved and submitted it.

Katie Templar, deputy head, Qatar International School:

We were involved with troubleshooting and sorting out teething problems with the software before the exams. For example, when we were first trying out a mock exam on the platform, there was a glitch with the timer and the exam stopped short. Flagging this up early meant that in the real thing, the exam went ahead without any hitches.

Of course, such issues understandably made some staff nervous and a little hesitant, because the old-fashioned route of paper exams can seem “safe”.

With computers, there is the worry that something will malfunction or run out of charge - that’s something you don’t worry about when working on paper.

‘It felt like a natural and logical step, and it was no effort to sell the idea to students’

The students were far less cautious, as shown by the fact that when we offered them the choice of the digital route or the traditional pen-and-paper route, 92 per cent opted for the former, a total of 127.

In fact, many said they were actually relieved at the prospect of a typed exam. They had been more worried about doing a written paper as lockdown learning meant they had been out of practice using a pen and paper. They also said they were able to get more of their ideas down due to typing faster than they can write.

How did the students get on?

While specific schools’ results are not being shared, White says the pilot was a “real success” and a report produced by Pearson notes that students’ results weren’t hindered by the computers.

“[Using] both statistical and qualitative analysis, [the results show that] the performance of students on screen was comparable to those who sat the paper in a traditional style,” she says.

Ridgway certainly feels that opting for digital over paper had no negative effect on results.

“All our students hit their target grades using the digital exams,” he says. “We can see we were actually even slightly up on previous years for value added.”

However, White says this does not mean that Pearson considers the pilots an “overnight success”, and she admits there is a lot of work to be done if digital exams are to be used in the domestic GCSE market, and across all subjects.

After all, scaling up from a few international schools to a whole nation’s assessment regime means any teething issues seen in the pilot will become far more complicated.

For example, digital poverty - the term used to describe students who don’t have access to technology at home - and inequality in edtech provision in schools are issues that many have raised.

“Qatar is very privileged and the school can afford to have devices for all students - would this be possible in England?” asks Templar.

“Would schools in England have enough suitable IT support? Would [digital exams] hamper students who don’t regularly learn through their computer and wouldn’t have the IT literacy that our students have?”

Ridgway echoes these thoughts, saying: “Access to devices is a challenge in England, and this is something the pandemic highlighted.”

‘Power leads connected each device to ensure there were no flat-battery disasters’

The term for this is “digital divide”. Carl Cullinane, director of research and policy at charity the Sutton Trust, published a paper that highlighted the digital divide experienced by young people, which was cited in a Department for Education report into education technology for remote teaching.

Cullinane says that, although he thinks “digitising exams is inevitable”, we need to consider the “consequences for those without easy access to devices outside of school”.

“Any move to digital exams would need to be accompanied by a guarantee that all students would have access to appropriate equipment,” he says.

“Part of the move to online exams will also involve management of the equipment, and the expense incurred from doing this will no doubt pose challenges for schools.”

In a report reflecting on the results of the pilot, Pearson acknowledges that this is an issue that needs addressing, noting in its recommendations that there would need to be an “audit of the existing technological infrastructure in schools for delivering on-screen assessment”.

Furthermore, the school leaders who took part in the pilots attributed some of their success to the fact that students were already using computers as part of their standard classroom practice.

“All our students have their own laptops and were already working on computers rather than paper,” says Cairns. “If you weren’t already doing this then it could be a more difficult transition.”

If digital exams were to be rolled out nationwide in England, it would have a huge impact on day-to-day classroom teaching, as Cat Scutt, director of education and research at the Chartered College of Teaching, explains.

She says it wouldn’t be fair to ask students to complete a high-stakes assessment on a computer if using a keyboard feels “unusual or unfamiliar”.

Consequently, she says “moving to digital exams will require more frequent classroom use if students are to be capable of using the necessary equipment in the exam hall”.

Michael Walker, former executive director of the Qualifications and Curriculum Development Agency (QCDA), agrees that the impact on day-to-day teaching cannot be overlooked.

“Any new approaches will have consequences in an already busy curriculum offer in our schools and colleges,” he says.

For example, if typing exams does become the norm then preparing students for that would have to be built into education far earlier, because otherwise “those with good keyboard skills will find technology-based examinations more straightforward”.

Pearson’s report acknowledges this, noting that in a Teacher Tapp survey of 5,000 secondary school teachers about preparing students for online exams, a third said their learners lacked the necessary digital skills.

“[We would need] a thorough examination of how the curriculum can better provide learners with the necessary digital skills,” adds White.

‘Any move to digital exams would need to be accompanied by a guarantee that all students would have access to appropriate equipment’

As well as becoming proficient typists, students would also have to be comfortable using computers to outline their thinking and thought processes, something that Walker says is very different to paper and pen.

“The medium is the thought process,” he explains. “When students type a response, the way they work and edit is very different to handwriting an essay. They will need more practice in order to be prepared for working in that way.”

This, in turn, poses questions about how long an exam might need to be, as Ridgway outlines.

“When this is rolled out for all students taking GCSEs, I think we need to think about how long each exam is, and are we adapting the approach along with the adaptations to technology? Can we utilise artificial intelligence, for example, and use adaptive questioning?”

This is a point that Walker believes is worth further investigation, not least because of the mental and physical strain of sitting a series of long screen-based exams over a short period.

“It’s very different from a normal day because in a classroom you naturally get breaks as children move rooms,” he says.

“You don’t have the exam situation whereby you would potentially be staring at a screen for an extended period. This, again, needs further consideration; maybe an approach utilising shorter exams.”

Ridgway also says that this is especially important for students with special educational needs and disabilities (SEND).

“It will be essential to have longer gaps between exams to avoid extended stretches working at the computer. We also need to make sure that schedules avoid back-to-back exams, which would also help students with SEND to refocus between their exams,” he adds.

Get this right, though, and he says the overall benefits of digital tests for students with SEND could outweigh the issues.

“We are an inclusive school with students who have varying needs and access arrangements, so for that small number of students who already worked on laptops, it made them feel more included as part of the cohort,” he says.

When, not if…

It’s clear, then, that there are lots of important questions still to be answered around digital assessments.

Yet it also seems clear that a move to digital assessments being included in the exam season is a distinct possibility in the years ahead - especially after Ofqual announced last May that it was exploring how GCSEs could work as online assessments, something the government is watching closely.

“Ofqual has committed, as part of its corporate plan, to exploring the potential opportunities and implications of digital assessment in qualifications,” a Department for Education spokesperson said.

“We welcome the work it is doing in this area and look forward to working together with it on this.”

How far-reaching that work will be remains to be seen: Ofqual chief Jo Saxton said recently that while she has a “whole team” working on technology in assessment, “we’re not seeing the end of desks and pens and paper any time soon”.

Clearly, though, the exam providers think it is worth the time and effort to develop, test and trial digital assessments. For example, AQA is beginning trials for shorter online English language exams and OCR is trialling online versions of iGCSE English language and AS history.

In fact, Walker says that, with all these different trials taking place, “regulators will need to play a key role” to ensure the “standardisation in the format and transmission of computer-based assessments” so that staff and students can adapt to any evolution in exams.

That may be a few years off but - given that for some schools and their students, digital assessments are already their frontline reality - it seems reasonable to imagine that it won’t be too long before, in some exams at least, a very different phrase is heard at the end.

“Keyboards away, stop typing,” perhaps?

Grainne Hallahan is senior analyst at Tes

