Regulating student awards

How we are monitoring the use of degree algorithms

We’re surveying universities and colleges about their use of degree classification algorithms.

If you're a provider with degree awarding powers (DAPs), we're asking you to reflect on the findings we've set out in the report and, if necessary, adjust your practice to ensure the degrees you're awarding retain their value over time.

We’re also asking any provider with DAPs to let us know by 31 July 2026 if you are intending to continue using the following in your degree algorithms from September 2026:

  • steps that discount students’ credit with the lowest marks
  • steps that take the best result from a choice of multiple algorithms.
Take the survey

Who needs to respond to the survey

You do not need to submit a response if:

  • you do not use, or do not intend to continue using, degree algorithms
  • you use degree algorithms but from September 2026 these will not include either discounting or multiple algorithms as defined in our report.

If you do intend to continue using discounting or multiple algorithms, please submit a response.

Discounting is ‘an algorithm step that includes a specified volume of module credit in the calculation to determine a student’s class of degree and automatically excludes (‘discounts’) credits with the lowest marks from the calculation.’ 

Multiple algorithms: ‘the practice that concerns us is where a provider feeds module marks for a student into two or more different algorithms and awards the student the highest class of degree determined by any of the algorithms.’
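As an illustration of the discounting step only (the definitions above, from our report, are authoritative), here is a minimal sketch in Python. The module profile, marks and the 20-credit discount volume are all hypothetical, and it assumes whole modules are discounted:

```python
def weighted_average(modules):
    """Credit-weighted mean mark over (credits, mark) pairs."""
    total_credits = sum(credits for credits, _ in modules)
    return sum(credits * mark for credits, mark in modules) / total_credits

def discount_lowest(modules, credits_to_discount):
    """Discounting step: exclude whole modules with the lowest marks
    until the specified volume of credit has been discounted
    (assumes module sizes fit the discount volume exactly)."""
    kept = sorted(modules, key=lambda module: module[1])  # lowest marks first
    dropped = 0
    while kept and dropped + kept[0][0] <= credits_to_discount:
        dropped += kept.pop(0)[0]
    return kept

# Hypothetical profile: six 20-credit modules.
modules = [(20, 55), (20, 62), (20, 48), (20, 71), (20, 66), (20, 59)]
print(weighted_average(modules))                       # all 120 credits counted
print(weighted_average(discount_lowest(modules, 20)))  # lowest 20 credits discounted
```

In this hypothetical profile, discounting the weakest 20 credits lifts the classification average from roughly 60.2 to 62.6, which is why the step matters for whether the award reflects the student's full profile of achievement.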

We have held some discussions with providers about whether borderline rules fall under the category of ‘multiple algorithms’ and so need to be reported in July.

We are aware that some providers have borderline rules that adopt the following general approach:

The student receives the highest class awarded out of either:

  • a simple average (e.g. ‘a first class degree will be awarded where a student has an overall weighted average of 70%’); or
  • a lower average and preponderance rule (e.g. ‘a first class degree will be awarded where a student has an overall weighted average of 68% provided they have achieved 70% in modules totalling at least x credits’).

We see this sort of borderline rule as an attempt to give a fair outcome across a profile of student achievement. That means you do not need to tell us in July if you intend to continue using borderline rules of this sort. 
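As a sketch only, a borderline rule of the general shape above might look like this in Python. The thresholds follow the worked figures (70% and 68%), while `min_first_credits` stands in for the provider-chosen ‘x’ and the module profile is hypothetical:

```python
def borderline_first(modules, min_first_credits):
    """Borderline rule sketch: a first on a 70% weighted average, or on a
    68% average with at least `min_first_credits` credits marked 70%+.
    `min_first_credits` is a placeholder for the provider's chosen 'x'."""
    total_credits = sum(credits for credits, _ in modules)
    average = sum(credits * mark for credits, mark in modules) / total_credits
    credits_at_first = sum(credits for credits, mark in modules if mark >= 70)
    return average >= 70 or (average >= 68 and credits_at_first >= min_first_credits)

# Hypothetical profile: weighted average of 68.5 with 60 credits at 70 or above.
modules = [(20, 72), (20, 71), (20, 70), (20, 66), (20, 65), (20, 67)]
print(borderline_first(modules, min_first_credits=60))  # qualifies via the preponderance route
```

Both routes apply to the same profile of marks, so the rule acts as a single borderline test rather than a choice between independent classification algorithms.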

The rules relating to awarding the best result from multiple algorithms that we would like you to tell us about in July are rules where:

  • a student is awarded the better result from two or more algorithms
  • the weightings of the years of study or the use of discounting rules are different between those algorithms.

For example, the following would count as multiple algorithms about which you should tell us (the example is for a three-year, full-time degree):

  • A student is awarded the better result from the following two algorithms:
    • Algorithm one: year one is weighted 0%, year two is weighted 30% and year three is weighted 70%.
    • Algorithm two: year one is weighted 0%, year two is weighted 50% and year three is weighted 50%.

The following would also be an example of multiple algorithms about which you should tell us:

  • A student is awarded the better result from the following two algorithms:
    • Algorithm one: year one is weighted 0%, year two is weighted 50% and year three is weighted 50% with the 20 credits of year two with the lowest marks discounted from the calculation.
    • Algorithm two: year one is weighted 0%, year two is weighted 0% and year three is weighted 100%.
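The first worked example above can be sketched as follows. The per-year averages are hypothetical, and the sketch simplifies by taking a single average mark per year:

```python
def overall_mark(year_marks, weights):
    """Weighted overall mark from per-year average marks."""
    return sum(weight * mark for weight, mark in zip(weights, year_marks))

def best_of(year_marks, *weightings):
    """Multiple-algorithm step: run every set of weightings over the same
    marks and award on the highest resulting overall mark."""
    return max(overall_mark(year_marks, weights) for weights in weightings)

# Hypothetical student: average marks for years one to three.
marks = (58, 61, 68)
algorithm_one = (0.0, 0.3, 0.7)  # year weightings 0% / 30% / 70%
algorithm_two = (0.0, 0.5, 0.5)  # year weightings 0% / 50% / 50%
print(overall_mark(marks, algorithm_one))            # back-weighted result
print(overall_mark(marks, algorithm_two))            # evenly split result
print(best_of(marks, algorithm_one, algorithm_two))  # the student receives the higher
```

For these hypothetical marks the two algorithms give 65.9 and 64.5, and the higher is always awarded, so in effect different students can be classified under different rules.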

We are asking you to let us know if you are using multiple algorithms of this sort because we are concerned that choosing the better result from multiple algorithms in this way makes it harder for a provider to demonstrate compliance with condition B4. This is because it is more difficult to understand why the best outcome is always the outcome that most appropriately reflects the knowledge and skills attained by the student.

Whatever kind of algorithm your institution uses, it’s important that awards appropriately reflect the knowledge and skills of each student, and that any changes you make to those algorithms are tested using evidence of actual student attainment.

What we’ll use the survey responses for

At this stage, we’re seeking to develop our understanding of the use of discounting or multiple algorithms in the sector. This will inform our next steps as we move towards DAPs reform, as trailed in our strategy consultation and the government’s post-16 education and skills white paper.

We plan to follow up as appropriate with individual institutions after 31 July to better understand their reasons for continuing to use discounting or multiple algorithms. 

We will use this information to develop our understanding of how institutions are using these steps, the barriers to moving away from them, and how widespread their use is across the sector.

We are aware that this is a complex issue and that each institution will have its own view. We’ll continue to engage with universities and colleges and other sector bodies as we consider how to move forwards.

Contact us

If you have any questions about the survey, please contact Lauren Sloan at [email protected].

Published 13 May 2026
