No end point in sight

Introducing a system of end-point assessments for apprenticeships has created a multimillion-pound market, but delays and discrepancies are leaving students in limbo, unable to prove what they have learned and move forward with their careers. George Ryan investigates
23rd November 2018, 12:00am

Imagine signing up for a programme of study and not knowing how long it will take you to complete it, nor how you will be assessed at the end. Or how about being stuck in limbo for 18 months after finishing your course, unable to sit the final assessment because industry regulators are still deciding what it will involve?

Even for teachers working in schools, who have faced numerous qualification reforms in recent years, such a prospect must sound like a nightmare. But this is exactly the situation that has arisen for some apprentices who have found themselves unwittingly caught up in the biggest overhaul of assessment taking place in the education system today: the end-point assessment for apprenticeships.

The complex set of reforms underpinning this situation has spawned a new industry soon expected to be worth £250 million each year. Fears are growing that the opaque quality-assurance regime sitting beside it could lead to a splintered system of assessment that lacks consistency and proper oversight - one that is, potentially, ripe for abuse.

These problems can be traced back to the government-commissioned Richard Review of Apprenticeships, published in 2012, which called for the “overly detailed” apprenticeship frameworks to be replaced with simpler standards set by groups of employers. It is six years ago to the month that US entrepreneur Doug Richard - best known to many for appearing in the first two series of the BBC’s Dragons’ Den - proposed radical changes to the way apprenticeships were run in England. Buried in his report was a seemingly innocuous recommendation that paved the way for the chaos now engulfing the sector.

Entering a minefield

Under the previous system of frameworks, Richard argued, the process of continuous assessment was “driven by paper tests and assessors with a vested interest in learners passing the test”. Instead, the report recommended, apprentices should face an end-of-programme test to “assess whether the individual is fully competent and employable within their job and their sector”. And while employers should be involved in this, the process should be led by “entirely independent” assessors and, in turn, overseen by a “government body or regulator” in a “light-touch way”.

The goal sounds laudable. But putting this vision into practice has proved a minefield for the organisations involved. Figures released by the Department for Education this month show that, of the 325 apprenticeship standards currently approved for delivery, 89 still do not have an end-point assessment organisation in place. And even where the assessment process has been finalised, the approach taken varies significantly for each individual apprenticeship (see below).

The Institute for Apprenticeships - the body set up in 2017 to advise on funding levels and set quality criteria for apprenticeship standards - insists that, in most cases, apprentices end up being able to complete their training without a hitch. But across the further education sector, there are widespread concerns that the problem is far more serious than the institute acknowledges.

Since the first apprenticeship standards were introduced in September 2014, a whole new assessment industry has sprung into being. To date, 185 organisations have been approved to deliver apprenticeship testing on the Education and Skills Funding Agency’s formal register. By August 2020, all apprenticeship frameworks will have been phased out, and all apprentices will start on the new employer-designed standards - which include an end-point assessment.

Tom Bewick, chief executive of the Federation of Awarding Bodies (FAB), says this is still a nascent market but, eventually, 500,000 assessments could be carried out every year: “Assuming that apprenticeship volumes recover to the pre-levy level of around 800,000 starts per annum, we could be looking at an end-point-assessment market worth around £250 million a year,” he adds.

“England is the only other country in the world, after Canada, to have adopted an end-point testing regime in its apprenticeship system,” Bewick continues. “It is the right approach, since it will provide an objective assessment over time of how proficient, skilled and competent the next generation of occupational professionals will be. In the end, that should help raise our productivity as a nation, making us all better off.”

‘Culture shift’ required

It is still early days for the apprenticeship standards model: there have been 28,400 starts since September 2014, with 23,700 of these in the 2016-17 academic year alone. Despite the system having been up and running for four years, however, fewer than 5,000 end-point assessments have taken place so far. Bewick says a “huge culture shift is required”, as the biggest challenge is growing assessor capacity and capability by attracting people with industry-relevant experience to train as qualified assessors.

Out of the 185 end-point assessment organisations on the government’s register, 51 are awarding organisations. Kirstie Donnelly, managing director at City & Guilds Group, which carries out final assessments for 47 apprenticeship standards, says signing up has brought its challenges.

“We entered this market early on, as we saw end-point assessment as a vital part of an apprentice’s journey. But we had to invest significantly up front to develop our offer and put in place new resources, processes and technology,” she says. “We are starting to see end-point assessment organisations working together to resolve some of the issues they face - at a standards level but also at a policy level.

“The government has always intended that employers will make the final end-point assessment organisation decision, but it is still being made largely by the training provider and we don’t see this shifting in the near future, despite the decision being such a crucial one for apprenticeship success. Therefore, the importance of successful collaboration and partnership between the end-point assessment organisation, the provider and employer is critical.”

Graham Hasting-Evans, group managing director of awarding body NOCN, says this type of assessment is markedly different from his organisation’s bread-and-butter business. For vocational qualifications, learner achievement is measured during the training period through continuous assessment. In many cases, awarding organisations undertake a sample check of a cohort and then confirm the training centre’s view of attainment for the whole cohort.

End-point assessment, says Hasting-Evans, is fundamentally different: “Everyone is end-tested in an examination environment to an apprenticeship standard set by the employers using assessment instruments - or exams - also decided by the employers.”

There is more of a logistical burden, too, for assessment organisations - something that the FAB calls the “Penzance question” in its Rationale, Readiness and Risks report. It is a far cry from other qualifications, for which the education provider organises the physical location for an examination; for end-point assessment, it is up to the awarding organisation to find the site where testing will take place. In rural and remote areas - such as Penzance in Cornwall - where there may be only a couple of apprentices, this can present a challenge, but one that the FAB’s research finds is being met through the use of regional hubs or technology.

Another area of difference is quality assurance. Unlike other qualifications, for which the regulator Ofqual carries out this process, external quality assurance for end-point assessment was, as of October, being performed by no fewer than 47 separate organisations. This has led to accusations from industry groups that “everyone is singing from different hymn sheets”.

‘Huge headache’

According to Suzanne Offer, regulation manager for the Institute of the Motor Industry, the quality-assurance system is “a huge headache”. The institute is an end-point assessor across six apprenticeship standards and has been asking for clarity around external quality assurance for almost two years, she says.

“End-point assessment organisations don’t want to invest in something or build it up [only] for it to change in a few months,” Offer says.

Despite the large number of quality-assurance bodies, the biggest by far is the Institute for Apprenticeships, which currently quality-assures more than 162 apprenticeship standards through its contractor, Open Awards. But this was never meant to be the case. Official documents dating back to the time the body was established make clear that it was only intended to step in as a “last resort”.

The apparently unintentional expansion of its role highlights the difficulties that awarding bodies working in the sector face. Hasting-Evans says the quality-control process is “inefficient, ineffective, inconsistent, and a waste of taxpayers’ and [apprenticeship] levy-payers’ money”.

The chief executive of the Association of Employment and Learning Providers, Mark Dawe, believes Ofqual - which currently quality assures 50 apprenticeship standards - should be put in charge to ensure stability.

“I get that everyone is working hard to make this work, but the Institute for Apprenticeships needs to do more to sort this out,” he says. “There is no transparency coming from them. One thing they could do is give the quality-control process to Ofqual - they could do the job and they could get started really quickly.”

However, Neil Robertson, chief executive of the National Skills Academy for Rail, which provides quality assurance for eight apprenticeship standards, says industry bodies can help to enforce safety standards critical for the workplace. “We thought the current system was not up to scratch so we decided to offer [quality assurance] ourselves,” he adds. “We will say, ‘You don’t get away with this if you don’t do it right.’ ”

Maintaining quality is important for the integrity of the new apprenticeships system, says the FAB’s Bewick. “End-point assessment organisations must remain independent of training providers, otherwise problems around collusion and conflicts of interest will abound. The key principle in our new apprenticeship model is that providers can no longer mark their own homework.

“Over time, end-point assessment, done properly, is going to significantly shake up what we mean by quality in the provider delivery marketplace. Because not only will Ofsted take a view but the data shared by end-point assessment organisations will give a picture of how well the training is equipping the apprentices. Pass and failure rates will be scrutinised carefully.”

The DfE, though, insists that there is little to be concerned about. A spokesperson says: “99.97 per cent of apprentices who are expected to reach their gateway review within the next 12 months are covered by an end-point assessment organisation. These are a crucial aspect of apprenticeship reforms, giving employers assurance that apprentices have been independently assessed as job-ready at the end of their programme.”

Sir Gerry Berragan, chief executive of the Institute for Apprenticeships, is also at pains to stress that “only a small handful” of apprentices are due to finish their courses in the next 12 months without an assessment in place. He adds: “End-point assessment is a new and important feature of apprenticeships and we know that employers, providers and assessment organisations are working hard to adapt to this and deliver a high-quality service to apprentices.

“We are aware that this can provide challenges. We have recently reviewed our processes and are working with employers to ensure all apprentices can undertake end-point assessment at the appropriate time.”

Donnelly from City & Guilds also believes that, despite the difficulties, the current approach is worth sticking to. “It’s essential the environment is right for quality end-point assessment organisations to continue building their end-point assessment offer so that all apprentices can benefit from this vital step,” she says.

George Ryan is an FE reporter for Tes. He tweets @GeorgeMRyan

How are apprentices assessed?

The end-point assessment can take various forms: exams, professional discussions, workplace observation, portfolios of work, assignments or assessment of work output, to name but a few.

Since the apprenticeship reforms gave employers the power to design the new standards, assessment methods are up to those involved in the industry’s trailblazer groups. The types of assessment, therefore, vary across different apprenticeships.

For example, the assessment for a level 3 advanced baker apprentice includes three elements: a 90-minute knowledge test; a three-hour observation; and an eight- to 10-piece portfolio of projects accompanied by a 20-minute presentation.

A level 3 apprentice spectacle maker, on the other hand, must complete both an observation of the quality of their technical skills and a one- to two-hour “professional discussion” with an assessor to establish their understanding and application of knowledge, skills and behaviours.

Assessment with no teeth

Delayed end-point assessment can leave apprentices stuck without certification to show they have completed their training; this in turn can prevent them from moving on in their jobs.

In one case, nine level 4 dental practice apprentices have been waiting 18 months to take their end-point assessment. The apprentices at Barnet and Southgate College in North London were meant to have their assessment in May 2017, but were told it was not yet ready, and it has since gone back to the drawing board. Principal David Byrne says these delays are putting apprentices’ lives on hold.

“We’re talking about apprentices coming from small- and medium-sized enterprises,” he says. “These employers and employees are unable to move ahead in their careers. After 18 months, there is a risk that what they learned will no longer be relevant to the assessment.”

Byrne says the situation is embarrassing for the college. “The apprentices don’t understand how we can deliver the apprenticeship but not have the assessment sorted out beforehand,” he explains.

Ultimately, it is the learner who loses out, according to Mark Dawe, chief executive of the Association of Employment and Learning Providers: “It is really unfair on the apprentices, who are stuck waiting for their end-point assessment.

“It should be available straightaway when they are ready to take it.”

The design of the system is at fault for causing the delays, Dawe says, adding: “All the rules for assessment are different for each apprenticeship standard. Someone at the centre needs to say this is ridiculous.”

The apprentice’s experience

Muminah Rasul says she felt like a “guinea pig” at times being part of the first cohort on a new events-management apprenticeship.

“When I started, there wasn’t an end-point assessment in place, but [my college] was in talks with various people to get this all sorted,” Rasul says. “Being an 18-month apprenticeship, they knew it would be sorted before then, as there were already discussions happening.

“There is now a test confirmed and we have been discussing the process of what will happen when I finish in a couple of months’ time.”

Despite the test not being ready when she began, Rasul, who is also a representative for the National Society of Apprentices (part of the National Union of Students), says the process has been relatively smooth in her case.

She adds: “Some things have changed along the way but, again, this hasn’t been an issue because all of this was always going to happen at the end anyway.”
