Key performance measure 9
Value for money
KPM 9 provides a set of measures, using student survey and outcomes data, which can be used to consider value for money in higher education.
KPM 9A: Percentage of undergraduate students who say that university offers good value for money
KPM 9B: Percentage of undergraduate students responding positively to National Student Survey (NSS) questions about aspects of quality
KPM 9C: Proportion of students at providers with student outcomes indicators above our numerical thresholds
Many of our strategic goals relate directly or indirectly to the value for money of higher education for both students and taxpayers.
Previous research commissioned by the OfS suggests that students’ perspectives on value for money are primarily driven by the quality of teaching, assessment, feedback, and learning resources. Furthermore, the research shows that a significant proportion of students also value positive employment outcomes and earnings after graduation.
When students complete their courses and secure positive employment outcomes, this also represents value for money for taxpayers, who support the higher education system.
It is not possible to encapsulate all these aspects of value for money in one measure. KPM 9 presents a set of measures that, taken together, can be used to consider value for money in higher education.
KPM 9A presents data from polling commissioned by the OfS, which asked students about their views on value for money.
KPM 9B presents data from the National Student Survey (NSS) about the aspects of quality that previous research suggests are important to students in considering value for money.
KPM 9C presents data on student outcomes. This shows the proportion of students at providers with student outcome indicators above the numerical thresholds we have set.
If our interventions to improve quality and student outcomes have the desired effect, we would expect KPMs 9B and 9C to increase. This should also be reflected in improved student perceptions of value for money, as measured by KPM 9A.
KPM 9A shows that our 2023 polling of students in England suggests that around 52 per cent of current undergraduate students think university offers good value for money, considering the costs and benefits. This is an increase from 46 per cent in 2022.
However, this result should be interpreted with caution for several reasons.
The polling company we commissioned surveyed only a subset of current undergraduate students. It is possible that the views of this subset differ from the views of students more generally, simply because of random variation.
There is also a chance that the method of data collection (surveying a sample of volunteers) introduces bias.
The 2022 and 2023 data was collected by a different polling company from the one used in previous years, with a different panel of students. This change means that the results from 2022 onwards may not be comparable with earlier years.
The Student Academic Experience Survey (conducted by Advance HE and the Higher Education Policy Institute) also measures students’ perceptions of value for money. In 2023, the survey found that 37 per cent of undergraduate students in England reported higher education to be ‘good’ or ‘very good’ value for money. This was a slight increase from 34 per cent the previous year.
KPM 9B shows the percentage of students who responded positively to NSS questions about the teaching on their course, the assessment and feedback they received, and the learning resources available to them.
Results from 2023 are shown separately as they should not be directly compared. In particular, the removal of the ‘neutral’ response option in 2023 means that we would expect more students to respond positively to the new survey, regardless of any change in the student experience.
While the three theme measures used for the 2023 results have the same titles as the earlier theme measures, for the 2023 data they are published as experimental statistics. This means there is not enough evidence to be sure that they are the best way of summarising the NSS results. Therefore, these theme measures are subject to change in the future.
In the 2023 NSS, on questions about the teaching on their course, 84.7 per cent of students responded positively. On questions about assessment and feedback, 78.0 per cent responded positively. On questions about learning resources, 86.1 per cent responded positively.
KPM 9C shows that the proportion of students at providers where the relevant continuation indicator is above our numerical threshold (at 95 per cent statistical confidence) is 90 per cent for the most recent year. This increased from 86.1 per cent over the previous three years.
For completion, KPM 9C shows that the proportion of students at providers where the relevant indicator is above our numerical threshold (at 95 per cent statistical confidence) is around 88.7 per cent for the most recent year. This increased from 86.2 per cent over the previous three years.
For progression, KPM 9C shows that the proportion of students at providers where the relevant indicator is above our numerical threshold (at 95 per cent statistical confidence) is 82.5 per cent. This decreased from 86.6 per cent over the previous two years.
KPM 9C is subject to potential volatility that can affect year-on-year comparisons. If a provider’s performance is close to the numerical threshold, random statistical variation may mean that its indicator value moves above or below the threshold in different years. In some years we may have 95 per cent statistical confidence that its indicator value is above our numerical threshold and in other years we may not. This might mean that a provider is included in the KPM 9C data in some years and not in others, even where there is not a material change in its performance. This can lead to year-on-year variations, which may be more marked if large providers are included in only some of the years shown in KPM 9C. For future iterations of KPM 9C, we will consider how we can communicate the effect of statistical uncertainty on the measure.
For KPM 9A, our polling company collected 668 responses between 21 March and 6 April 2023 from UK undergraduate students in England to the question 'Considering the costs and benefits of university, do you think it offers good value for money?'. This measure is based on the percentage of respondents who answered 'Yes'.
For KPM 9B, the NSS surveys undergraduate students in the spring of their final academic year. The 2023 NSS achieved a 71.3 per cent response rate in England, with 290,706 students taking part in the survey.
The positivity measure shows the percentage of respondents at English higher education providers registered with the OfS who gave a positive answer to the questions covered by three theme measures in the NSS: the teaching on my course; assessment and feedback; and learning resources. The positive answers are the first two possible answers to each question.
The theme measures are positivity measures calculated for groups of questions. For example, the theme measure for “teaching on my course” summarises into a single value the positive responses to all four of the questions within this theme. For more information on how theme measures are calculated, see ‘How do you calculate theme measures’ in About the NSS data.
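The positivity and theme measure calculation described above can be sketched as follows. This is a minimal illustration, not the official NSS methodology (which is set out in 'About the NSS data'): it assumes responses are coded 1 (most positive) to 4, counts codes 1 and 2 as positive, and pools positive responses across every question in a theme. The function name and the response data are hypothetical.

```python
def theme_measure(responses_by_question):
    """responses_by_question: {question: list of response codes (1-4)}.

    Returns the percentage of all responses across the theme's
    questions that were positive (coded 1 or 2), pooling the
    questions into a single value as the theme measure does."""
    positive = total = 0
    for codes in responses_by_question.values():
        positive += sum(1 for c in codes if c in (1, 2))
        total += len(codes)
    return 100 * positive / total


# Illustrative (invented) responses for the four questions in the
# 'teaching on my course' theme.
teaching = {
    "Q1": [1, 2, 2, 3, 1],
    "Q2": [2, 2, 4, 1, 1],
    "Q3": [1, 3, 2, 2, 2],
    "Q4": [2, 1, 1, 3, 2],
}
print(round(theme_measure(teaching), 1))  # 80.0
```

Pooling responses across questions, rather than averaging per-question percentages, means questions with more responses carry proportionally more weight in the theme value.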
The continuation, completion and progression indicators used in KPM 9C are the same as those used in our regulation of student outcomes. The indicators are calculated from individualised student data from the Designated Data Body Student Record and Student Alternative Record, as well as the Individualised Learner Record from the Education and Skills Funding Agency (ESFA). OfS-registered providers submit data as required to the relevant data body. A subset of providers not registered with the OfS also submit data to these data bodies. The progression measure also links this individualised student data to survey responses from the Graduate Outcomes survey.
The first four years of data (three years for progression) shown for this KPM are calculated using data originally submitted and signed off by the provider's accountable officer, or approved data amendments signed off by 20 July 2022. The fifth year of data (fourth year for progression) shown for this KPM is calculated using data originally submitted and signed off by the provider's accountable officer, or approved amendments signed off by 5 May 2023. This means that previously published KPM values remain unchanged by recent data amendments affecting earlier years of data.
Continuation outcomes are measured by identifying a cohort of entrants to higher education qualifications and following them through the early stages of their course to track how many continue in active study or qualify one year and 15 days after they started (two years for part-time students).
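The continuation census point described above can be sketched as a small date calculation. This is an illustrative sketch only: the function name is hypothetical, it treats the part-time census point as two years and 15 days after the start, and it ignores leap-day edge cases.

```python
from datetime import date, timedelta

def continuation_census(start, part_time=False):
    """Return the date at which continuation is assessed: one year
    and 15 days after the start date, or two years and 15 days for
    part-time students (an assumed reading of the definition).
    Ignores the 29 February edge case for simplicity."""
    years = 2 if part_time else 1
    return start.replace(year=start.year + years) + timedelta(days=15)


# A full-time student starting on 20 September 2021 is checked on
# 5 October 2022; a part-time student a year later.
print(continuation_census(date(2021, 9, 20)))        # 2022-10-05
print(continuation_census(date(2021, 9, 20), True))  # 2023-10-05
```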
Completion outcomes are measured by identifying a cohort of entrants to higher education qualifications and following them through subsequent years of their course to track how many continue in active study or qualify four years and 15 days after they started (six years and 15 days for part-time students).
Progression outcomes are measured as the proportion of Graduate Outcomes survey respondents who reported that they had progressed to professional or managerial employment, further study or other positive outcomes, 15 months after gaining their qualification. The data is restricted to UK-domiciled qualifiers.
Further information about the indicator definitions can be found in the document ‘Description of student outcome and experience indicators used in OfS regulation’.
KPM 9C calculates the proportion of students taught at OfS-registered providers with student outcome indicators that are above our numerical thresholds, based on indicators for which at least 95 per cent of the distribution of statistical uncertainty falls above the relevant numerical threshold. The numerical thresholds for each measure, mode and level of study can be found in the document ‘Setting numerical thresholds for condition B3’.
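The condition that at least 95 per cent of the distribution of statistical uncertainty falls above the threshold can be sketched as follows. This is a minimal illustration, not the OfS's published method for constructing the uncertainty distribution: it assumes a normal approximation characterised by a standard error, and the function name and figures are invented.

```python
import math

def confidently_above(indicator, std_error, threshold, confidence=0.95):
    """Return True if at least `confidence` of a normal uncertainty
    distribution centred on `indicator` (with the given standard
    error) lies above `threshold` -- an assumed approximation."""
    z = (indicator - threshold) / std_error
    # Normal CDF via the error function: P(true value > threshold).
    prob_above = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return prob_above >= confidence


# An indicator of 85% with a 1.5 point standard error, against an
# 80% threshold: confidently above.
print(confidently_above(85.0, 1.5, 80.0))   # True
# Against an 83% threshold the indicator is above, but not with
# 95 per cent confidence, so it would not count towards KPM 9C.
print(confidently_above(85.0, 1.5, 83.0))   # False
```

The second case illustrates the volatility point made earlier: a provider whose indicator sits just above the threshold can fall in or out of the measure from year to year through random variation alone.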
The proportion of students is calculated separately for each student outcome measure: the headcount number of students in the populations of indicators above the numerical threshold is divided by the total headcount number of students in the population for that measure across all OfS-registered providers. The numbers of students with outcomes above numerical thresholds are calculated separately for each mode and level of study before being aggregated to give an overall proportion for each measure.
KPM 9C is based on the population of students primarily studying in the UK taught by OfS-registered providers. This encompasses students that are registered with and taught by an OfS-registered provider, or (where data is available) are taught by an OfS-registered provider under a sub-contractual partnership arrangement with a different provider.
KPM 9C does not take account of judgements we may make about whether an individual provider has met our minimum requirements for student outcomes. Our approach to making such judgements is set out in regulatory advice 20.
The first four years of data (three years for progression) shown for this KPM include providers who were registered with the OfS as at 30 September 2022. Later years of the time series include providers who were registered with the OfS at the point of publication of the summer release of student outcomes data. This means the number of providers included in each year will vary.
KPM 9C is subject to potential volatility that can affect year-on-year comparisons. If a provider’s performance is close to the numerical threshold, random statistical variation may mean that its indicator value moves above or below the threshold in different years. In some years we may have 95 per cent statistical confidence that its indicator value is above our numerical threshold and in other years we may not. This might mean that a provider is included in the KPM 9C data in some years and not in others, even where there is not a material change in its performance. This can lead to year-on-year variations, which may be more marked if large providers are included in only some of the years shown in KPM 9C. For future iterations of KPM 9C, we will consider how we can communicate the effect of statistical uncertainty on the measure. For further information about the effect of statistical uncertainty on the measure, please see the ‘assessment of uncertainty’ document (PDF).
If you have any queries, feedback or suggestions about KPM 9, please contact Mark Gittoes at [email protected].
Last updated 09 October 2023
09 October 2023
- Updated KPMs 9A and 9B to include NSS 2023 data.
08 August 2023
- Annual update to KPM 9C data.
03 November 2022
- KPM 9C published