Ofqual Investigation into OCR's Summer 2014 GCSE and A Level Marking Issues


Ofqual has published a report following its investigation of marking issues identified at the exam board OCR last summer. The regulator’s investigation followed concerns about OCR’s ability to mark all of its 2014 GCSE and A level papers on time.

Ofqual’s investigation into the handling of GCSE and A level results found that the near miss was caused mainly by problems with an online marking system. Last year’s problems at the exam board occurred after it moved to 100 per cent online marking and tightened up some of its processes.
Main Findings:
Ahead of marking and awarding summer 2014 GCSE and A level papers, OCR had a number of significant and known risks not present in other recent summer series.
First, like other exam boards, in 2014 OCR offered no January series or re-sit opportunities for GCSE and A level. This linearisation meant that OCR had approximately 900,000 more scripts to mark in summer 2014 than in previous series.
Second, OCR had introduced an additional level of monitoring for summer 2014 and anticipated that this would result in more assessors being stopped from marking than in previous summer series.
Third, OCR introduced a more robust standardisation process intended to remove inconsistencies across its different stages. OCR’s final review report identified that its estimate that the new process would extend timescales by two to three days proved too low: in most cases it took up to a week to clear assessors through standardisation (p 42, AS09).
Fourth, in summer 2014 OCR moved to 100 per cent e-marking for the first time and implemented a new web-based version of its electronic script marking system (scoris web assessor). Certain functionality issues with scoris web assessor contributed to the significant marking issues OCR experienced later in the marking window.
Fifth, OCR had been involved in an intensive restructure programme over the preceding 12 months. As OCR’s preliminary review report and transition delivery review reveal, the time and resources spent implementing the restructure, plus uncertainty among staff, had limited the effectiveness of contingency planning for events such as the scoris web assessor performance issues.
OCR also introduced a new governance structure in January 2014 that emphasised the management of accountabilities, including business planning for each business area. Also, and crucially, the restructure had removed the qualification manager structure (approximately 120 personnel) who had previously been the primary operational interface with assessors during the marking period. The qualification managers were replaced by a new Examinations & Assessor team that had responsibility for monitoring and managing the quality and quantity of marking and determining what interventions and actions were needed. This was in conjunction with the 21 chairs of examiners who now had an enhanced role in recruiting and managing senior assessors. The change to the qualification manager structure took effect from January 2014.
The restructure transition was signed off as complete at the end of March/early April 2014. OCR commissioned a post-implementation review to look at the effectiveness of the Transition Delivery Group. We received a copy of the Transition Delivery Review report on 3rd November 2014. Relevant evidence from this review has been included in this report.
The executive summary of OCR’s final review report stated: “OCR came very close to missing major external deadlines during the summer 2014 series. OCR demonstrated characteristic resilience in dealing with the summer’s problems. Those problems should never have arisen.”
The critical findings of our investigation, which run through this report, are as follows:
  • OCR’s restructure impacted on the summer 2014 series. In particular, the qualification manager structure was removed without a full understanding and mapping of how qualification managers interfaced with assessors – including the support qualification managers provided to assessors during marking and managing the marking itself.
  • OCR had limited understanding of assessors’ availability.
  • There was fragmented governance and a lack of clarity in key roles and responsibilities in managing marking.
  • There was a lack of understanding among key senior managers of the end-to-end process of marking and awarding. This affected their ability to see the link between the scoris performance issues early in June and their potential impact on marking shortfalls and on meeting marking deadlines.
  • There was no cross-business contingency planning and risk identification.
  • Senior managers were working with flawed understanding and assumptions.
  • OCR’s control of third parties within its syndicate structure was a contributing factor to some of the issues identified.