About the NSS data

Changes in NSS 2023

As a result of the review of the NSS, the survey questions and response scales have changed for NSS 2023. The main changes include:

  • new direct questions with a 4-point item-specific response scale for core and additional questions, replacing the Likert response scale (with the exception of the overall satisfaction question, the healthcare practice placement questions and the optional bank questions)
  • a new additional question on mental wellbeing services
  • a new additional question on freedom of expression, which is asked to students in England only
  • the overall satisfaction question is asked to students in Scotland, Wales and Northern Ireland only
  • an expansion of the student characteristics splits presented in the results
  • introduction of suppression thresholds for cases where everyone (or all but one) in a group responds negatively
  • updated structure for the dissemination data results files.

Because of these changes, it is not valid to compare question-level responses from NSS 2023 with those from previous years: NSS 2023 is a completely new version of the survey. Many questions are new or have been reworded, and even where questions are the same (or very similar), the response options offered are different. We do not endorse any presentation of the data which compares or aggregates question-level data in this way, as it may be misleading.

Find out more about the changes and why we made them

FAQs

All students are asked core questions relating to the following aspects of the student learning experience:

  • Teaching on my course
  • Learning opportunities
  • Assessment and feedback
  • Academic support
  • Organisation and management
  • Learning resources
  • Student voice

The new additional question on mental health and wellbeing services is asked to all students; however, the new additional question on freedom of expression is asked to students in England only. The final overall satisfaction question is asked to students in Scotland, Wales and Northern Ireland only. All students are asked to give positive and negative comments on their student learning experience as a whole.

The response options for the NSS 2023 survey vary from question to question, because the survey has moved to an item-specific response scale for the core questions. The optional bank questions, the overall satisfaction question (asked to students at non-English providers) and the healthcare practice placement questions continue to use a five-point Likert scale. For the remaining questions, students are asked to respond in terms of degree in some cases (for example, ‘to a great extent’) and in terms of frequency in others (for example, ‘very often’). However, the response options are always presented from the most positive to the least positive choice over a four-point scale.

Using the NSS data

Providers may wish to use their NSS results in marketing materials, to promote particular courses or the provider as a whole. This is permitted, but there are several restrictions depending on the type and quality of the data being shared.

The publication thresholds (a minimum 50 per cent response rate and at least 10 students) must be adhered to at all levels. Any results below this threshold must not be released publicly.

Published data

Published data, which comprises the results of the core and additional questions at course level, may be used in the following ways:

  • These results can be used in marketing materials and may be attributed to the NSS.
  • Publication thresholds of a 50 per cent response rate and 10 responses must be adhered to at all levels.
  • Open text comments (responses to the open text question, ‘Looking back on the experience, are there any particularly positive or negative aspects you would like to highlight?’) are not public data, but this qualitative data may be paraphrased in marketing materials, as long as the paraphrased text does not identify any individuals and is not attributed to the NSS.

This is the data that is available in Excel format, through the visualisation tool, or through Discover Uni.

For example:

97 per cent of students at ___University/college responded that staff were good at explaining things in the National Student Survey 2023

97 per cent of students on our ___ course were positive about staff making the subject engaging in the National Student Survey 2023

Theme measures

We have published theme measures this year as experimental statistics. This is because we do not yet have enough evidence to be sure that the current theme measures are the best way of summarising the results. We are planning further work on this in 2024, which may lead to small changes to the way we calculate theme measures in the NSS 2024 publication. This means that while you may wish to use the theme measures, you should be aware that they may change, and that in some cases there may be better ways of presenting the data. We recommend that you consider the questions which make up each group and do not rely only on the theme measure. Theme measures should not be used in marketing or presented to onward users without the questions which were used to create that theme.

For example:

80 per cent of students at ___University/college were positive about the teaching on their course, based on experimental statistics from the National Student Survey 2023. (The following scores combined to produce this theme: 70 per cent of students said staff were good at explaining things, 85 per cent said staff often made the subject engaging, 90 per cent said the course was intellectually stimulating and 65 per cent said they were challenged to achieve their best work.)

Unpublished data

Unpublished data may also be used in marketing materials, but these may not be attributed to the NSS.

This includes results for the optional bank questions and provider-specific questions, available to providers through the NSS data dissemination portal. This data is largely for internal purposes only, for providers to identify and develop activities for quality enhancement. However, unpublished data may be used in the following ways:

  • These results can still be used in marketing materials but may not be attributed to the NSS. For results to be used in marketing materials, the publication thresholds of a 50 per cent response rate and 10 responses must be adhered to at all levels.
  • Open text comments from institution specific questions may not be used in marketing materials.
  • Institutions may share open text comment data with a third party, but only on the basis that the third party processes the data on behalf of the institution for the purposes set out in the NSS privacy notice. The institution must ensure that a data sharing agreement is in place with the third party, with contractual provisions ensuring that the data is not shared with any other party.

For example:

90% of students agree that ‘Teaching staff test what I have understood rather than what I have memorised’

95% of students on ___ course agree with the statement ‘My higher education experience has helped me plan for my future career’

The results of the NSS are subject to the NSS privacy statement, which states that anonymised open text comments are only shared with the relevant provider or funding body, or with OfS-approved researchers or sector organisations. Therefore, providers may not quote open text comments in marketing materials.

However, open text comments from the core NSS questionnaire may be paraphrased in marketing materials, as long as the text does not identify any individuals, and the comments are not attributed to the NSS.

Open text comments from the optional bank questions or the provider-specific questions may not be used in any form.

The registering provider has overall control of a programme’s content, delivery, assessment and quality assurance arrangements. This is not necessarily where the student is taught, and a registering provider can allow another provider (the teaching provider) to deliver all, or part, of a programme that is designed, approved and owned by the registering provider. This is known as a sub-contractual, or sometimes franchise, arrangement.

The teaching provider, in the context of a sub-contractual arrangement, is that which delivers higher education provision to students on behalf of another higher education provider (the registering provider). For the NSS, the teaching provider is defined as the organisation supplying the majority of teaching in the penultimate year of the course.

Please be aware that data on Discover Uni is attributed to the teaching provider.

Understanding the NSS data

The positivity measure for each question is the proportion of respondents who gave a positive answer. The positive answers are the first two possible answers to the question. 

For example, the first NSS question is “How good are teaching staff at explaining things?” The response options are:

  • Very good
  • Good
  • Not very good
  • Not at all good
  • This does not apply to me

In this case, we calculate the positivity measure by dividing the number of students who answered “Very good” and “Good” by the number of students who gave one of the first four response options. We express this as a percentage.

In calculating the positivity measure, we do not include students who responded “This does not apply to me”.
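As a rough illustration, the calculation described above can be sketched in code. This is illustrative only, not OfS code; the function name and the hard-coded option lists are our own:

```python
# Illustrative sketch (not OfS code) of the positivity measure for one
# question: the share of respondents choosing one of the first two
# (positive) options, excluding "This does not apply to me".

NOT_APPLICABLE = "This does not apply to me"
POSITIVE = {"Very good", "Good"}  # the first two response options

def positivity_measure(responses):
    """Return the positivity measure (%) for a list of responses."""
    valid = [r for r in responses if r != NOT_APPLICABLE]
    if not valid:
        return None  # no valid responses to calculate from
    positive = sum(1 for r in valid if r in POSITIVE)
    return 100 * positive / len(valid)

responses = ["Very good", "Good", "Not very good", NOT_APPLICABLE]
print(positivity_measure(responses))  # 2 positive out of 3 valid -> 66.66...
```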

Because the student populations are so different across providers, and other groups of interest, it is not always helpful to directly compare them. For example, it may be the case that a difference in positivity measure between two providers can be accounted for by their different subject mix, or the different ages of their students.

To account for some of the factors which contribute to different NSS results, we construct benchmarks for each group of interest and each question. These benchmarks are sector averages, which we adjust based on the characteristics of the group that we are interested in. We can think of the benchmark as a prediction of what the NSS results for the sector would have been, had the sector had the same breakdown of students and subjects as the population we are interested in.

The benchmarks can be used to understand whether the positivity measure for a student group is higher or lower than would be expected, given the factors used in benchmarking. For example, suppose that the positivity measure for Question 1 (“How good are teaching staff at explaining things?”) is several percentage points above the benchmark. This tells us that students’ responses to this question are more positive than the sector average, and that this cannot be accounted for by the subject mix at the provider, or any of the other factors we use to construct the benchmark.
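The idea of a benchmark as a mix-adjusted sector average can be illustrated with a deliberately simplified, single-factor sketch. The real methodology adjusts for several factors at once, and all numbers below are made up:

```python
# Hypothetical, single-factor illustration of benchmarking: the benchmark
# is the sector-average positivity each student group would contribute,
# weighted by the provider's own mix of students. Illustrative data only.

sector_positivity = {"Business": 80.0, "French": 90.0}  # sector averages (%)
provider_mix = {"Business": 0.7, "French": 0.3}         # provider's student shares

benchmark = sum(sector_positivity[s] * provider_mix[s] for s in provider_mix)
print(benchmark)  # 80*0.7 + 90*0.3 = 83.0

provider_positivity = 88.0  # the provider's actual positivity measure (%)
print(provider_positivity - benchmark)  # 5.0 points above benchmark
```

A positive difference like this cannot be explained by the provider's subject mix, because the benchmark already accounts for it.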

The data dashboard is designed to help data users interpret the relation between the positivity measure and the benchmark.

The factors that we take into account when constructing the benchmarks are:

  • Mode of study
  • Subject of study
  • Level of study
  • Age
  • Ethnicity
  • Disability
  • Sex

However, fewer factors are used for part-time and apprenticeship students, particularly for question 28 (overall satisfaction). This is due to the smaller population size for these groups.

For the NSS student characteristics results, characteristics are taken out of the benchmarking factors when we show results for that characteristic. For instance, ethnicity is not used as a benchmarking factor when ethnic group is the characteristic being displayed. 

Find out more about our approach to benchmarking

The theme measures are positivity measures calculated for groups of questions. For example, the theme measure for “Teaching on my course” summarises into a single value the positive responses to all four of the questions within this theme. You may wish to use theme measures to get an overview of the NSS results for a provider or another publication unit. The themes that we use largely correspond to the question groupings in the NSS 2023 questionnaire.

This year, we have published the theme measures as experimental statistics. This is because we do not yet have enough evidence to be sure that the current theme measures are the best way of summarising the results. We are planning further work on this in 2024, which may lead to small changes to the way we calculate theme measures in the NSS 2024 publication. This means that while you may wish to use the theme measures, you should be aware that they may change, and that in some cases there may be better ways of presenting the data. We recommend that you consider the questions which make up each group and do not rely only on the theme measure. Theme measures should not be used in marketing or presented to onward users without the questions which were used to create that theme.

Find out more about our initial review of the theme measure

If you have any feedback on these measures, please email [email protected].

To create the theme measures, we first calculate a positivity measure for each respondent within the unit, and then construct the theme score as the mean of the individual positivity measures. This method ensures that each individual student has the same weight in calculating the theme measure, regardless of how many questions they answered with a response other than “this does not apply to me”.

The example below illustrates how we would calculate a theme measure for “Learning Resources” for a publication unit. To simplify, the example assumes that there were only five respondents within the unit.

Example: how we calculate theme measures

Respondents A to E answered the three “Learning resources” questions:

  • Q1: “How well have the IT resources and facilities supported your learning?”
  • Q2: “How well have the library resources (e.g. books, online services and learning spaces) supported your learning?”
  • Q3: “How easy is it to access subject specific resources (e.g. equipment, facilities, software) when you need them?”

Respondent | Q1 | Q2 | Q3 | Positivity measure for respondent (%)
A | “Very well” | “Well” | “Easy” | 100
B | “Well” | “Well” | “This does not apply to me” | 100
C | “Not very well” | “Not at all well” | “Not very easy” | 0
D | “Well” | “Not very well” | “Easy” | 66.7
E | “Very well” | “Not very well” | “Very easy” | 66.7

Note that the “This does not apply to me” option is excluded when calculating the positivity for respondent B.

Calculation:

Number of respondents = 5 

Total positivity measure (%) = 100 + 100 + 0 + 66.7 + 66.7 = 333.3 

Learning resource theme measure (%) = 333.3 / 5 = 66.7
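The worked example above can be reproduced in a short script. This is illustrative only; the set of positive options follows the rule, stated earlier, that the first two response options on each scale count as positive:

```python
# Sketch reproducing the theme-measure example: a per-respondent positivity
# measure is averaged across respondents, so each student carries equal
# weight regardless of how many questions applied to them.

NOT_APPLICABLE = "This does not apply to me"
POSITIVE = {"Very well", "Well", "Very easy", "Easy"}  # top two options per scale

respondents = [
    ["Very well", "Well", "Easy"],                          # A -> 100
    ["Well", "Well", NOT_APPLICABLE],                       # B -> 100
    ["Not very well", "Not at all well", "Not very easy"],  # C -> 0
    ["Well", "Not very well", "Easy"],                      # D -> 66.7
    ["Very well", "Not very well", "Very easy"],            # E -> 66.7
]

def positivity(responses):
    valid = [r for r in responses if r != NOT_APPLICABLE]
    return 100 * sum(r in POSITIVE for r in valid) / len(valid)

theme_measure = sum(positivity(r) for r in respondents) / len(respondents)
print(round(theme_measure, 1))  # 66.7
```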

We publish NSS results for a unit (for example: a provider, a country, or a subject area within a provider) whenever we can without compromising the quality of the statistics, or our commitments to fair data processing.

We do not publish results for a unit in the following circumstances:

  • When the response rate for the unit is less than 50 per cent. This is to control the risk that the views of respondents are different from the views of the non-respondents, making the estimates less accurate. Statistics related to these units are entirely removed from the published results.
  • When fewer than 10 individual students within the unit responded to the NSS. We do this to reduce the risk  that anyone looking at the NSS results could identify how an individual student responded. Statistics related to these units are entirely removed from the published results.
  • When the response rate for a publication unit is 100 per cent, and all, or nearly all, the students responded negatively to a particular question. This is to ensure that students feel able to honestly report poor quality, without risk of being identified.  This suppression is very rare; when it occurs, we indicate that the positivity measure for the question is very low using the marker “DPL” (Data protection low), but otherwise provide minimal information.
  • When, for a publication unit, a theme includes a question that is DPL suppressed. In this case, publishing the theme measure could allow data users to infer information about the suppressed DPL measure. We therefore suppress the theme measure too, and mark it as “DP” (data protection).
  • When we have other concerns about the quality of the data.
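The first three rules above can be sketched as a simple decision function. This is an illustrative sketch only, not the OfS implementation; the function name and arguments are our own:

```python
# Illustrative sketch of the publication rules described above: decide
# whether a unit's question-level result can be published, and with what
# marker. Not the OfS implementation.

def publication_status(respondents, population, positive_count):
    """Return 'publish', 'suppress', or 'DPL' for one question in one unit."""
    response_rate = respondents / population
    if response_rate < 0.5 or respondents < 10:
        return "suppress"  # below publication thresholds: removed entirely
    # 100 per cent response rate and all (or all but one) responded negatively:
    if respondents == population and positive_count <= 1:
        return "DPL"  # data protection low: marked, with minimal information
    return "publish"

print(publication_status(respondents=40, population=100, positive_count=30))  # suppress
print(publication_status(respondents=20, population=20, positive_count=1))    # DPL
print(publication_status(respondents=60, population=100, positive_count=45))  # publish
```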

Find out more about our approach to publication thresholds

All numbers in the NSS publication are rounded to 1 decimal place. All calculations are performed on non-rounded numbers, with the rounding completed as the last stage of processing. Occasionally, numbers may appear not to sum as expected due to rounding.
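A small, made-up example shows why rounding is left to the last step: rounding intermediate values can make figures appear not to sum as displayed.

```python
# Illustration of the note above: calculations use unrounded numbers and
# rounding to 1 decimal place happens last. All numbers here are made up.

parts = [62.84, 24.53, 12.63]  # unrounded percentages summing to 100.00

total_rounded_last = round(sum(parts), 1)                     # round at the end
total_of_rounded = round(sum(round(p, 1) for p in parts), 1)  # round each first

print(total_rounded_last)  # 100.0
print(total_of_rounded)    # 99.9 -- the pre-rounded components "lose" 0.1
```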

In the NSS publication, we generally count students using full-person equivalence (FPE). Each respondent counts as a single person, regardless of whether they are studying full-time or part-time. When a respondent is studying across several subject areas, their response is attributed to these subjects in accordance with the split reported to us by their provider. 

For example, suppose that a student spends half their time studying Business Studies and half their time studying French. This student will count as half a person (0.5 FPE) when calculating the results for the French subject area, and likewise for Business Studies. We take this approach regardless of whether the student is studying full-time or part-time.

Due to the way we count students using full-person equivalence (see above), you will sometimes see that the total number of respondents is reported as a decimal. This will only occur when the results are split by subject of study.
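Counting by full-person equivalence can be sketched as follows; the data structure is our own and the apportionments are illustrative:

```python
# Sketch of full-person equivalence (FPE) counting: a respondent split
# across subjects contributes fractionally to each subject's totals,
# which is why subject-level respondent counts can be decimals.

# Illustrative apportionment data: respondent -> {subject: fraction}
respondents = {
    "student_1": {"Business Studies": 0.5, "French": 0.5},
    "student_2": {"French": 1.0},
}

fpe_by_subject = {}
for splits in respondents.values():
    for subject, fraction in splits.items():
        fpe_by_subject[subject] = fpe_by_subject.get(subject, 0.0) + fraction

print(fpe_by_subject)  # {'Business Studies': 0.5, 'French': 1.5}
```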

Concepts specific to the NSS publication – such as the theme measures and the positivity measure – are defined in this question and answer set.

Our treatment of subject of study draws on the Common Aggregation Hierarchy (CAH), which is maintained by JISC.  For further information about the Common Aggregation Hierarchy, see Common Aggregation Hierarchy (CAH) | HESA

Other concepts used in the publication generally follow the definitions developed by the OfS in our approach to measuring quality and standards, as set out in our ‘Technical algorithms for student outcome and experience measures’ document. For example, the concept of level of study used in the NSS mirrors the definition of “IPLEVEL” given in this document.

Find out more about our approach to benchmarking, uncertainty and publication thresholds

Published 10 August 2023
