The NSS: In a league of its own

The sheer scale of the National Student Survey makes it a powerful weapon for students as well as a reliable guide for applicants.

When the National Student Survey (NSS) was first planned, I had my doubts. Most undergraduates would have nothing to compare their experience with, the process might be manipulated by unscrupulous academics or administrators, and a response rate of 50 per cent as a threshold for publication looked wildly ambitious.

Yet after 13 years of poring over the results, as a starting point for university profiles and a component of league tables, I am fully converted. Of course the NSS has its faults – even after last year’s introduction of improved questions, it remains an extremely broad-brush exercise that unintentionally favours particular types of institutions and makes life difficult for others.

The biggest institutions – particularly in London – feel at a disadvantage, while many small and medium-sized universities and colleges prosper year after year. But if this reflects reality, with successive generations of undergraduates preferring the more personal touch they experience at smaller institutions, applicants should be aware of it. And if students get the impression that their lecturers are more interested in their research than their teaching, this is also valuable information.

The results do not provide the last word in the assessment of teaching quality, any more than the Teaching Excellence Framework as a whole does. But they give the best available picture of students’ perceptions of their course – and it is difficult to see that being matched by any other exercise.

Which other survey can boast a response rate of 70 per cent, charting the views of 320,000 undergraduates? The trends are generally consistent (and overwhelmingly positive) – so much so that politicians and commentators often resort to quoting much smaller, less representative research to support a critical narrative. Satisfaction levels may be down this year, but still 83 per cent were positive about their course and only 8 per cent dissatisfied.

That is not to say that the NSS is perfect – in my view, it takes too narrow a view of students’ unions, for example, implying that their sole purpose is to represent their members academically. But more serious criticisms of the survey, that it encourages an ‘intellectual race to the bottom’ with lecturers dumbing down courses and reducing expectations to ensure positive results, are invariably anecdotal. There have been remarkably few examples of departments or institutions transgressing the rules.

All the current compilers of league tables remain content to make the NSS the biggest element in their calculations. While the way in which the results are used may vary, it has always been accepted that the views of those who have taken the course will be of most interest to anyone grappling with degree choices.

The survey’s outcomes have also provided unique leverage for students to force through improvements to services and facilities. In particular, levels of feedback and assessment practices have been given a focus that would never have been applied without the negative views expressed in successive editions of the NSS. Universities and colleges all over the country are producing much more prompt and effective feedback on students’ work as a result.

The numerous other examples of improvements attributable to the NSS include extensions of library opening hours, improved timetabling and better student facilities of various types. In their quest for good ratings, many universities and colleges mount publicity campaigns to remind students of the impact of previous surveys. Critics may sneer at such practices, but they demonstrate that the results can have an impact.

Even last year’s partial boycott of the NSS – now receding further – had more to do with the uses to which the results were being put at national level than dissatisfaction with the survey itself. Applicants would be much the poorer without the insight it provides.

The views expressed in this post are the author’s own.

Comments

Steve Walsh

Couldn’t agree more. There are many examples of institutions listening to students and making improvements because they want to improve their NSS results. Yes, part of the motivation is to improve in league tables and the TEF, but listening and responding to students is good however it happens. Shame the TEF has devalued the NSS just because the wrong universities did such a good job of improving student satisfaction with their teaching!

31 Jul 2018 - 1:01PM
