This guide helps higher education providers use the survey tool that forms part of our financial support evaluation toolkit.
What is the survey tool?
The survey tool is a set of survey questions designed to help higher education providers understand how recipients used the financial support offered to them by the provider, and the perceived value to recipients of that support.
The survey questions were piloted and cognitively tested to improve validity during a research project led by Sheffield Hallam University.
Guidance for providers
The survey tool helps you to understand how and why financial support affects academic, personal and social outcomes.
Where possible, this should be used alongside the statistical tool to understand whether financial support is effective in relation to key academic outcomes.
The survey was designed to help students consider their financial support package as a proportion of the financial resources they lived on during the previous academic year.
Survey questions investigate: how the additional support helped students to financially navigate that academic year; what this enabled them to do; and what might not have been possible otherwise (e.g. taking paid work, gaining other work or development experiences).
There are also questions designed to explore the degree of ‘belonging’ or ‘attachment’ students feel towards the institution, to understand the extent to which financial support fosters this.
What you will need:
- Basic unique identifying data and email addresses for your bursary recipients
- Expertise in survey research methods
- Intermediate data analysis skills
- Access to a survey data collection tool (e.g. Bristol Online Survey)
- Ethical clearance or advice in line with your own institution’s policies.
You may also benefit from:
- Additional demographic and other institutional data about your bursary recipients
- Advanced data and statistical analysis skills.
When is the right time to do the survey?
It is up to you when you choose to do the survey.
The institutions involved in the pilot research found that the period between November and December was most suitable for them: students could more easily recall their experiences from the previous academic year, and it avoided a clash with the National Student Survey (which normally runs during the second semester, January to April).
How long will it take?
Using the Bristol Online Survey (BOS) tool, the pilot institutions found that it took less than a day for registered BOS users to import and distribute the survey. Some time will then need to pass to collect responses.
Analysis time varies depending on the nature and extent of analysis. A trained analyst could expect to spend one day completing a detailed analysis, including cross tabulations.
You will also need to allow time with different stakeholders to interpret your findings, then decide if and how to make changes in response to what you have learned.
Sampling: who should complete the survey?
The survey is designed for students who received financial support in the previous completed academic year. It can therefore be completed by any continuing student, up to and including those in their final year.
Ideally, you would interpret your survey findings alongside statistical and interview findings for the same cohort years; however, this may not be possible at this stage due to data availability.
As with any survey, you should carefully consider your sampling methodology to ensure a robust approach.
Response rates and self-selection bias may affect the validity and transferability of any findings.
You may consider incentivising respondents to encourage representative samples, and weighting survey responses to address these limitations.
We recommend that you draw on experienced research expertise, which may be available within your institution, to identify and address any weaknesses in your sampling methodology.
Ensure that limitations in survey methodology are considered before drawing significant conclusions or making substantial claims based on this data alone.
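If you do choose to weight responses, a simple post-stratification approach rebalances respondents so that weighted group sizes match the known population of bursary holders. The sketch below shows this in Python with pandas; all data, column names, and group labels are hypothetical, chosen only to illustrate the calculation:

```python
import pandas as pd

# Hypothetical respondents and a known population breakdown (not toolkit data).
respondents = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "sex": ["F", "F", "F", "F", "M", "M"],
    "helped_stay": [1, 1, 0, 1, 1, 0],  # 1 = support helped them stay enrolled
})
population = pd.DataFrame({
    "sex": ["F", "M"],
    "n_students": [500, 500],  # known bursary-holder population by group
})

# Weight each respondent so weighted group sizes match the population.
resp_counts = respondents["sex"].value_counts().rename("n_resp")
pop = population.set_index("sex").join(resp_counts)
pop["weight"] = pop["n_students"] / pop["n_resp"]

# Weighted share answering "yes" (1), correcting for over-represented groups.
weighted = respondents.merge(pop["weight"], left_on="sex", right_index=True)
weighted_share = (weighted["helped_stay"] * weighted["weight"]).sum() / weighted["weight"].sum()
print(round(weighted_share, 3))  # 0.625, versus an unweighted share of 0.667
```

Here the unweighted sample over-represents one group, so the weighted estimate differs from the raw proportion; with real data the population counts would come from your student records.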
How to administer the survey
You can administer the survey using your institution’s preferred survey tool.
We recommend using the survey questions developed and piloted by the research team.
The pilot institutions found many benefits to using the BOS tool. With BOS, each bursary holder receives a unique version of the online survey, with demographic and institutional data already included as hidden fields. This shortens the survey for recipients and reduces data errors, and the link to student data records enables more sophisticated analysis. You can also easily add your own bespoke questions to the survey.
Step-by-step instructions on using BOS to administer the survey.
If you are using BOS, you can also import the survey questions to your own profile using a .json survey template (right-click on this link and choose the option to save or download the linked file, saving the .json file to your own files).
Analysing survey data
We recommend that you draw on experienced research expertise, which may be available within your institution, to analyse your survey data. You may also wish to refer to an example of analysis performed by one of the pilot institutions.
Your analysis may also be driven by any questions arising from the statistical analysis and interview data if these findings are available.
Options to use other linked data for advanced analyses
Your survey data can be linked with student demographic and institutional data. Assuming sufficiently large numbers, cross-tabulation analysis can be carried out between the key survey response variables and student record data.
Be careful to consider any ethical concerns that may arise from linking data. The survey introductory text advises respondents that “survey data will be completely anonymised at the reporting stage and your student data remains confidential and subject to data protection protocols”.
Responses can also be analysed in relation to the main demographic variables in the student record (e.g. bursary type and value; RHI; entry requirements; JACS code; sex; disability; age; ethnicity; distance from home; area disadvantage/POLAR3) to provide a more sophisticated understanding of how recipients use and value institutions’ financial support packages.
You may also be able to use additional linked data from your institution, to analyse the survey findings with other data (e.g. library or Virtual Learning Environment access, attendance monitoring, grades for completed modules, student ambassador roles).
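As a minimal illustration of this kind of linking, the sketch below joins survey responses to a student record extract on a shared identifier and cross-tabulates one response against bursary type. All identifiers, column names, and values are hypothetical, not drawn from the toolkit:

```python
import pandas as pd

# Hypothetical survey responses and student record extract.
survey = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105],
    "support_helped": ["yes", "yes", "no", "yes", "no"],
})
student_record = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105],
    "bursary_type": ["full", "partial", "full", "partial", "full"],
})

# Link the two datasets on the shared student identifier.
linked = survey.merge(student_record, on="student_id", how="inner")

# Cross-tabulate a key survey response against a student-record variable.
table = pd.crosstab(linked["bursary_type"], linked["support_helped"])
print(table)
```

With real data you would check cell counts before reporting: small cells both weaken any conclusions and risk making individual students identifiable, which matters for the anonymisation commitment described above.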
Consider possibilities in line with any widening participation strategic performance issues you have identified at your institution and questions arising from other evaluation or research activities.
Interpreting your results – what does it mean?
Find out more about how you might interpret your findings and plan your next steps.