Regulating student awards
Ensuring awards reflect students’ achievements
We have been concerned for some time that improved teaching may not fully explain the scale of the increase over time in the proportion of students awarded first class and upper second class degrees.
To support providers, we have published a report setting out things that institutions should consider when making changes to the algorithms they use to determine bachelors’ degree classifications.
This includes testing the classes of award given to students against their actual attainment, and taking reasonable steps to ensure any award accurately reflects their academic achievements.
The report also discusses our concerns about the use of two steps sometimes included in an algorithm: awarding a student the best result from multiple algorithms and discounting credit with the lowest marks.
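To illustrate the two steps, here is a minimal sketch in Python. The mark profile, class boundaries and the choice to discount two credits are all invented for illustration and do not represent any institution’s actual algorithm.

```python
# Illustrative sketch only: the mark profile and class boundaries below are
# invented and do not represent any institution's actual rules.

def classify(mean_mark: float) -> str:
    """Map a mean mark to a degree class using invented boundaries."""
    if mean_mark >= 70:
        return "First"
    if mean_mark >= 60:
        return "Upper second"
    if mean_mark >= 50:
        return "Lower second"
    return "Third"

def mean_of_all_marks(marks: list[float]) -> float:
    """Algorithm A: a simple mean over every credit."""
    return sum(marks) / len(marks)

def mean_discounting_lowest(marks: list[float], discard: int = 2) -> float:
    """Algorithm B: discount the credits with the lowest marks, then average."""
    kept = sorted(marks)[discard:]
    return sum(kept) / len(kept)

marks = [40, 52, 68, 69, 70, 71, 72, 74]

# Step 1: discounting credit with the lowest marks can move a borderline
# profile up a class (mean 64.5 without discounting vs about 70.7 with it).
class_a = classify(mean_of_all_marks(marks))        # "Upper second"
class_b = classify(mean_discounting_lowest(marks))  # "First"

# Step 2: "best result from multiple algorithms" awards whichever outcome
# is higher for the student.
order = ["Third", "Lower second", "Upper second", "First"]
best = max([class_a, class_b], key=order.index)     # "First"
```

The sketch shows why these steps only ever move outcomes upwards: each one takes the more favourable of two results, so adding either step to an algorithm can raise, but never lower, the class awarded.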
Case reports
We have also carried out detailed assessments of the changes three universities have made to their classification algorithms for bachelors’ degrees over time.
Questions about the report
Since we published the report on degree classification algorithms, we’ve been engaging with the sector on this topic. Some of the common questions we’ve been asked, and our responses, are:
Are institutions banned from using multiple algorithms or discounting?
No, but we think the use of these steps in algorithms is likely to place an institution at increased risk of not being compliant with condition B4.
In our November report we set out our view that institutions should move away from using these steps when designing or changing their degree algorithms.
We have had several recent discussions with the sector about the discounting of credit with the lowest marks. Institutions have explained that they use this step to offset the impact of capped marks on the classes of degree awarded. Marks are capped following failure at the first attempt in an assessment, and some students from underrepresented groups have a disproportionately high number of capped marks.
We recognise there are important issues to consider here. We support institutions’ commitment to access and to successful outcomes for students from underrepresented groups, and recognise that institutions think discounting has benefits in supporting these students. However, our view is that institutions also need to be able to demonstrate that the classes of degree awarded to all students appropriately reflect those students’ knowledge and skills.
Where institutions decide to continue to use discounting or multiple algorithms, we are more likely to engage with them to understand their reasons and the evidence supporting their position.
We’ve also set out a calibration exercise in Annex C of our report that an institution could use to demonstrate the link between class of award and the knowledge and skills of students.
We have agreed with the three providers we have investigated that they will review their use of these steps. We will consider the outcomes of those reviews and may use them to inform further discussion with the sector about the use of these steps. You can read the case reports above.
Can an institution carry out a calibration exercise if it no longer holds students’ assessed work?
Yes. To carry out a calibration exercise, an institution needs to form a view about an individual student’s overall attainment so it can compare this with the degree classification awarded.
Reviewing the student’s actual work would be the best way to do this and would give the highest degree of confidence in the findings of the exercise.
However, if you cannot access some or all of the relevant student work, you may need to rely more heavily on statements of expected attainment at assessment or module level, such as marking criteria that describe the performance expected at each degree class for a given assessment or module.
Where it is not possible to review the student’s work, institutions could use such assessment- or module-level documentation to estimate whether an individual student with a particular profile of marks would be likely to demonstrate the knowledge and skills required for the class of degree they were awarded. Institutions should take the most robust approach available to them.
We previously published supplementary guidance on the retention of assessed student work.
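The comparison at the heart of such an exercise can be sketched as follows. The student IDs, classes and sample layout here are our own invented illustration, not the format set out in Annex C of the report: for each sampled student, the class the algorithm awarded is set against the class judged appropriate from their work or the mapped criteria.

```python
# Hypothetical sample: the student IDs and judgements below are invented.
# Each row pairs the class the algorithm awarded with the class an academic
# judged appropriate from the student's work or mapped marking criteria.
judgements = [
    ("S001", "First", "First"),
    ("S002", "First", "Upper second"),
    ("S003", "Upper second", "Upper second"),
    ("S004", "Upper second", "Upper second"),
]

# Cases where the awarded class exceeds or falls short of judged attainment.
mismatches = [(sid, awarded, judged)
              for sid, awarded, judged in judgements
              if awarded != judged]

match_rate = 1 - len(mismatches) / len(judgements)
```

A pattern of awarded classes consistently above the judged attainment would be the kind of finding an institution should investigate further.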
Do institutions have to stop using discounting or multiple algorithms?
We recognise that this is a complex issue and it’s important that institutions continue to treat students fairly. We have not set out a formal requirement for providers to stop using discounting or multiple algorithms, but we’ve been clear that we see the use of these steps as high risk for non-compliance with condition B4.
If you identify that your institution may be at increased risk of not being compliant with our conditions due to its academic regulations, we think it’s important you take action to reduce that risk as quickly as possible.
In doing so, institutions must also comply with consumer law and ensure they are treating students fairly.
If you would like to make changes to the algorithms your institution uses to classify degrees but feel unable to do so before September 2026 because of your contractual or consumer law obligations, you can let us know this by 31 July 2026. We might ask you to provide evidence that shows this.
Contact us
If you still have questions about the report or what we’re asking universities and colleges to do, contact Lauren Sloan at [email protected].
Last updated 13 May 2026
13 May 2026 - We added answers to common questions about our report on degree classification algorithms.