Official statistic

Key performance measure 9

Value for money

KPM 9 provides a set of measures, using student survey and outcomes data, which can be used to consider value for money in higher education.

KPM 9A: Percentage of undergraduate students who say that university offers good value for money

KPM 9B: Percentage of undergraduate students responding positively to National Student Survey (NSS) questions about aspects of quality

KPM 9C: Proportion of students at providers with student outcomes indicators above our numerical thresholds

KPM 9D: Proportion of employers responding positively to Employer Skills Survey question about preparedness of graduate employees

Note: Survey responses are from establishments that have recruited university or higher education leavers in the last two to three years.

Many of our strategic goals relate directly or indirectly to the value for money of higher education for both students and taxpayers.

Previous research commissioned by the OfS suggests that students’ perspectives on value for money are primarily driven by the quality of teaching, assessment, feedback, and learning resources. Furthermore, the research shows that a significant proportion of students also value positive employment outcomes and earnings after graduation.

When students complete their courses and secure positive employment outcomes, this also represents value for money for taxpayers, who support the higher education system.

It is not possible to encapsulate all these aspects of value for money in one measure. KPM 9 presents a set of measures that, taken together, can be used to consider value for money in higher education.

KPM 9A presents data from polling commissioned by the OfS, which asked students about their views on value for money.

KPM 9B presents data from the National Student Survey (NSS) about the aspects of quality that previous research suggests are important to students in considering value for money.

KPM 9C presents data on student outcomes. This shows the proportion of students at providers with student outcome indicators above the numerical thresholds we have set.

KPM 9D presents data from the Employer Skills Survey. The measure shows the proportion of employers who think that graduates are prepared for work after graduation. 

If our interventions to improve quality and student outcomes have the desired effect, we would expect KPMs 9B and 9C to increase. This should also be reflected in improved student perceptions of value for money, as measured by KPM 9A, and in improved employer perceptions of how prepared graduates are for work, as measured by KPM 9D.

KPM 9A

KPM 9A shows that our 2023 polling of students in England suggests that around 52 per cent of current undergraduate students think university offers good value for money, considering the costs and benefits. This is an increase from 46 per cent in 2022.

However, this result should be interpreted with caution for several reasons.

The polling company we commissioned surveyed only a subset of current undergraduate students. It is possible that the views of this subset differ from the views of students more generally, simply because of random variation.

There is also a chance that the method of data collection (surveying a sample of volunteers) introduces bias.

The 2022 and 2023 data was collected using a different polling company to previous years, with a different panel of students. This change means that the results from 2022 onwards may not be comparable with previous years.

The Student Academic Experience Survey (conducted by Advance HE and the Higher Education Policy Institute) also measures students’ perceptions of value for money. In 2023, the survey found that 37 per cent of undergraduate students in England reported higher education to be ‘good’ or ‘very good’ value for money. This was a slight increase from 34 per cent the previous year.

KPM 9B

KPM 9B shows the percentage of students who responded positively to NSS questions about the teaching on their course, the assessment and feedback they received, and the learning resources available to them.

Results from 2023 are shown separately as they should not be directly compared with results from earlier years. In particular, the removal of the ‘neutral’ response option in 2023 means that we would expect more students to respond positively to the new survey, regardless of any change in the student experience.

While the three theme measures used for the 2023 results have the same titles as the earlier theme measures, for the 2023 data they are published as experimental statistics. This means there is not enough evidence to be sure that they are the best way of summarising the NSS results. Therefore, these theme measures are subject to change in the future.

In the 2023 NSS, on questions about the teaching on their course, 84.7 per cent of students responded positively. On questions about assessment and feedback, 78.0 per cent responded positively. On questions about learning resources, 86.1 per cent responded positively.

KPM 9C

KPM 9C shows that the proportion of students at providers where the relevant continuation indicator is above our numerical threshold (at 95 per cent statistical confidence) is 82.5 per cent for the most recent year. This decreased from 86.1 per cent over the previous five years.

For completion, KPM 9C shows that the proportion of students at providers where the relevant indicator is above our numerical threshold (at 95 per cent statistical confidence) is around 89.7 per cent for the most recent year. This increased from 86.2 per cent over the previous five years.

For progression, KPM 9C shows that the proportion of students at providers where the relevant indicator is above our numerical threshold (at 95 per cent statistical confidence) is 87.8 per cent. This increased from 86.6 per cent over the previous four years.

KPM 9C is subject to potential volatility that can affect year-on-year comparisons. If a provider’s performance is close to the numerical threshold, random statistical variation may mean that its indicator value moves above or below the threshold in different years. In some years we may have 95 per cent statistical confidence that its indicator value is above our numerical threshold and in other years we may not. This might mean that a provider is included in the KPM 9C data in some years and not in others, even where there is no material change in its performance. This can lead to year-on-year variations, which may be more marked if large providers are included in only some of the years shown in KPM 9C.

KPM 9D

KPM 9D shows the proportion of employers who think that graduates are prepared for their first job after graduation. The figure shows that a high proportion of employers in 2019 and 2022 responded positively to the work preparedness question. 

In 2019, the Employer Skills Survey expanded its scope to include a question asking about the ‘preparedness for work of university or higher education leavers’. This question was repeated in the 2022 survey, and the results from both years are presented here as a time series.

KPM 9A

For KPM 9A, our polling company collected 668 responses between 21 March and 6 April 2023 from UK undergraduate students in England to the question 'Considering the costs and benefits of university, do you think it offers good value for money?'. This measure is based on the percentage of respondents who answered 'Yes'.
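As a simple illustration of the calculation behind this measure, the sketch below computes the percentage of 'Yes' answers from a set of poll responses. The response counts are hypothetical and are shown only to make the arithmetic concrete; they are not the actual 2023 polling returns.

```python
# Hypothetical split of 668 poll responses; not the actual 2023 polling data.
responses = {"Yes": 347, "No": 321}

total = sum(responses.values())
kpm_9a = 100 * responses["Yes"] / total
print(f"KPM 9A: {kpm_9a:.0f} per cent of {total} respondents answered 'Yes'")
```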

KPM 9B

For KPM 9B, the NSS surveys undergraduate students in the spring of their final academic year. The 2023 NSS achieved a 71.3 per cent response rate in England, with 290,706 students taking part in the survey.

The positivity measure shows the percentage of respondents at English higher education providers registered with the OfS who gave a positive answer to the questions within three NSS themes: the teaching on my course; assessment and feedback; and learning resources. The positive answers are the first two of the possible answers to each question.

The theme measures are positivity measures calculated for groups of questions. For example, the theme measure for ‘the teaching on my course’ summarises into a single value the positive responses to all four of the questions within this theme. For more information on how theme measures are calculated, see ‘How do you calculate theme measures’ in About the NSS data.
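The sketch below illustrates how a theme-level positivity measure of this kind can be computed from question-level response counts, assuming four questions per theme and response scales on which the first two options count as positive. The question labels and counts are invented for illustration; the published theme measures are calculated as described in ‘About the NSS data’.

```python
# Hypothetical question-level response counts for one NSS theme.
# Each list holds counts for the four response options, ordered from
# most positive to least positive; the first two options count as positive.
theme_responses = {
    "question 1": [5200, 3100, 900, 300],
    "question 2": [4800, 3200, 1100, 400],
    "question 3": [5000, 3000, 1000, 500],
    "question 4": [4600, 3400, 1200, 300],
}

positive = sum(counts[0] + counts[1] for counts in theme_responses.values())
answered = sum(sum(counts) for counts in theme_responses.values())
theme_positivity = 100 * positive / answered
print(f"Theme positivity measure: {theme_positivity:.1f} per cent")
```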

KPM 9C

The continuation, completion and progression indicators used in KPM 9C are the same as those used in our regulation of student outcomes. The indicators are calculated from individualised student data from the Designated Data Body (DDB) Student Record and the Individualised Learner Record from the Education and Skills Funding Agency (ESFA). OfS-registered providers submit data as required to the relevant data body. A subset of providers not registered with the OfS also submit data to these data bodies. The progression measure also links this individualised student data to survey responses from the Graduate Outcomes survey.

Table 1 shows the version of data used for the KPM calculations. Previously published KPM values remain unchanged by recent data amendments affecting earlier years of data.

Year of entry/year of qualifying date: Continuation, years 1 to 4; Completion, years 1 to 4; Progression, years 1 to 3
Data used for the calculation: Data submitted and signed off by the provider's accountable officer, or approved data amendments signed off, by 20 July 2022

Year of entry/year of qualifying date: Continuation, year 5; Completion, year 5; Progression, year 4
Data used for the calculation: Data submitted and signed off by the provider's accountable officer, or approved data amendments signed off, by 5 May 2023

Year of entry/year of qualifying date: Continuation, year 6; Completion, year 6; Progression, year 5
Data used for the calculation: Data submitted and signed off by the provider's accountable officer, or approved data amendments signed off, by 29 May 2024

Providers were required to submit 2022-23 DDB student data using a new data model and a new data platform, and the data collection encountered a number of delays. Consequently, additional risks for the quality of data were tolerated in some areas of the 2022-23 data returns and we are assessing the impacts of those additional risks for each OfS data output on a case-by-case basis. For this data output, our data quality assessment has indicated that the additional risks have not impacted on the reliability of the data in a material way.

Continuation outcomes are measured by identifying a cohort of entrants to higher education qualifications and following them through the early stages of their course to track how many continue in active study or qualify one year and 15 days after they started (two years for part-time students).

Completion outcomes are measured by identifying a cohort of entrants to higher education qualifications and following them through subsequent years of their course to track how many continue in active study or qualify four years and 15 days after they started (six years and 15 days for part-time students).

Progression outcomes are measured as the proportion of Graduate Outcomes survey respondents who reported that they had progressed to professional or managerial employment, further study or other positive outcomes 15 months after gaining their qualification. The data is restricted to UK-domiciled qualifiers.
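As a simplified illustration of the continuation definition above, the sketch below checks whether a student is in active study or has qualified at the census date of one year and 15 days after entry (two years for part-time students). The record fields and helper functions are illustrative assumptions, not the structure of the individualised student data used to build the published indicators.

```python
from datetime import date, timedelta

# Simplified sketch of the continuation census-date check. Leap years are
# ignored (a year is treated as 365 days) to keep the illustration short.

def continuation_census_date(entry_date: date, part_time: bool) -> date:
    """Census date: one year and 15 days after entry (two years for part-time students)."""
    years = 2 if part_time else 1
    return entry_date + timedelta(days=365 * years + 15)

def positive_continuation(entry_date: date, part_time: bool,
                          active_until: date | None, qualified_on: date | None) -> bool:
    """True if the student is in active study, or has qualified, at the census date."""
    census = continuation_census_date(entry_date, part_time)
    in_active_study = active_until is not None and active_until >= census
    has_qualified = qualified_on is not None and qualified_on <= census
    return in_active_study or has_qualified

# Hypothetical full-time entrant who was still in active study 14 months after entry.
print(positive_continuation(date(2021, 9, 20), part_time=False,
                            active_until=date(2022, 11, 30), qualified_on=None))
```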

Further information about the indicator definitions can be found in the document ‘Description of student outcome and experience indicators used in OfS regulation’.

KPM 9C calculates the proportion of students taught at OfS-registered providers with student outcome indicators that are above our numerical thresholds, based on indicators for which at least 95 per cent of the distribution of statistical uncertainty falls above the relevant numerical threshold. The numerical thresholds for each measure, mode and level of study can be found in the document ‘Setting numerical thresholds for condition B3’.

The proportion of students is calculated separately for each student outcome measure, by considering the headcount number of students in the populations of those indicators above the numerical threshold, divided by the total headcount number of students in the population for that measure across all OfS-registered providers. The numbers of students with outcomes above numerical thresholds are calculated separately for each mode and level of study before being aggregated to give an overall proportion for each measure.
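The sketch below illustrates the shape of this calculation under simplifying assumptions: each provider-level indicator is represented by a normal approximation of its statistical uncertainty, and an indicator counts towards the numerator only if at least 95 per cent of that distribution sits above the threshold. The provider values, standard errors, headcounts and threshold are hypothetical, and the normal approximation is an assumption for illustration; the OfS methodology documents define the actual statistical approach.

```python
from statistics import NormalDist

# Illustrative sketch of the KPM 9C aggregation. Each record stands for one
# provider-level indicator for a given measure, mode and level of study.
# All numbers are hypothetical.
indicators = [
    {"value": 0.93, "std_error": 0.010, "headcount": 12000, "threshold": 0.80},
    {"value": 0.81, "std_error": 0.015, "headcount": 3500, "threshold": 0.80},
    {"value": 0.75, "std_error": 0.020, "headcount": 1800, "threshold": 0.80},
]

def confidently_above_threshold(value, std_error, threshold, confidence=0.95):
    """True if at least `confidence` of the uncertainty distribution falls above the threshold."""
    prob_above = 1 - NormalDist(mu=value, sigma=std_error).cdf(threshold)
    return prob_above >= confidence

numerator = sum(r["headcount"] for r in indicators
                if confidently_above_threshold(r["value"], r["std_error"], r["threshold"]))
denominator = sum(r["headcount"] for r in indicators)
print(f"Proportion of students above the threshold: {100 * numerator / denominator:.1f} per cent")
```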

KPM 9C is based on the population of students primarily studying in the UK taught by OfS-registered providers. This encompasses students that are registered with and taught by an OfS-registered provider, or (where data is available) are taught by an OfS-registered provider under a sub-contractual partnership arrangement with a different provider. 

KPM 9C does not take account of judgements we may make about whether an individual provider has met our minimum requirements for student outcomes. Our approach to making such judgements is set out in regulatory advice 20.

The first four years of data (three years for progression) shown for this KPM include providers who were registered with the OfS as of 30 September 2022. Later years of the time series include providers who were registered with the OfS at the point of publication of the summer release of student outcomes data. This means the number of providers included in each year will vary.

KPM 9C is subject to potential volatility that can affect year-on-year comparisons. If a provider’s performance is close to the numerical threshold, random statistical variation may mean that its indicator value moves above or below the threshold in different years. In some years we may have 95 per cent statistical confidence that its indicator value is above our numerical threshold and in other years we may not. This might mean that a provider is included in the KPM 9C data in some years and not in others, even where there is no material change in its performance. This can lead to year-on-year variations, which may be more marked if large providers are included in only some of the years shown in KPM 9C. For further information about the effect of statistical uncertainty on the measure, please see the ‘assessment of uncertainty’ document (PDF).

KPM 9D

KPM 9D presents the data from the Employer Skills Survey published by the Department for Education. The survey is run every three years and has a weighted sample of around 1.7 million establishments in England. Only establishments that had recruited university or higher education leavers in the last two to three years were asked the graduate preparedness question, with a weighted response of around 240,000 establishments.

The proportion presented here is the sum of establishments that responded ‘Very well prepared’ or ‘Well prepared’ when asked about the preparedness for work of university or higher education leavers, divided by the total number of establishments asked the question. This calculation uses the weighted response data.

As this is a sample survey, weightings are applied to the raw response data. This uses information from the Inter-Departmental Business Register to ensure that the responses from the sample are scaled up or down to better reflect the likely responses had all UK businesses been surveyed. In all surveys of this kind there is a chance that estimates are subject to sampling error and using weighted data helps to mitigate this risk.
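As an illustration of this weighted calculation, the sketch below sums the establishment weights for the two positive response categories and divides by the weighted total of establishments asked the question. The response breakdown is hypothetical, and the non-positive category labels are assumptions for illustration.

```python
# Hypothetical weighted counts of establishments by response; category labels
# other than the two positive options are assumed for illustration.
weighted_responses = {
    "Very well prepared": 70_000,
    "Well prepared": 130_000,
    "Poorly prepared": 30_000,
    "Very poorly prepared": 8_000,
    "Don't know": 2_000,
}

positive = sum(weighted_responses[k] for k in ("Very well prepared", "Well prepared"))
total = sum(weighted_responses.values())
print(f"KPM 9D: {100 * positive / total:.1f} per cent of establishments responded positively")
```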

Contact us

If you have any queries, feedback or suggestions about KPM 9, please contact Mark Gittoes at [email protected].

Published 08 September 2022
Last updated 17 September 2024
17 September 2024
Updated KPM 9C, published KPM 9D.
09 October 2023
Updated KPMs 9A and 9B to include NSS 2023 data.
08 August 2023
Annual update to KPM 9C data.
03 November 2022
KPM 9C published
