The evaluation challenge: understanding what works in access and participation

Two new reports add to a growing body of knowledge about the impact of access and participation activities and identify ways to overcome the challenges of evaluation.

We’re up for the challenge

Evaluation is not easy work. I’ve lost count of the number of times someone has said ‘I don’t envy you!’ after a discussion about evaluation and the complexities involved in understanding what, how, and why something works, or doesn’t. But without evaluation – without striving to understand how to design programmes and policies that truly benefit people – progress towards equal opportunities in higher education is much harder.

We’re excited to see providers’ access and participation plans and the Uni Connect partnerships committing to ambitious evaluation strategies. If they deliver on these commitments, we can expect to see increased understanding of the impact of different interventions over time. This should mean greater progress on reducing equality gaps.

We’ve massively upped the ante in our regulatory and funding requirements on evaluation. But we don’t expect providers and partnerships to meet our expectations without support. We know that effective evaluation remains a challenge for many universities and colleges. The OfS has a key role in driving improvement through our regulatory guidance and effective practice resources. We’re also committed to improving our own practice. Our evidence and evaluation strategy, launched in February last year, outlines our ambitious vision not only for the sector, but also for our own work at the OfS.

Moving towards a step change in evaluation practice

As part of our support role, we’ve developed an evaluation toolkit to improve understanding of the impact of financial support. There’s been a lot of debate about the impact of bursaries over the years. Previous analysis found no evidence that institutional bursary schemes have an observable effect on choice of where to study, or on continuation rates, for young full-time first-degree students. However, this analysis did not seek to understand the complexity of bursaries within the post-2012 fee system. So we worked with academics from across the sector to help us understand the impact of financial support in this context. The result was a new evaluation toolkit that has enabled providers to use rigorous mixed-methods evaluation, combining quantitative data analysis with validated survey and interview tools. These tools have been designed by academics and practitioners and tested on the ground.

Today we’ve published findings from providers who have used the toolkit. The results are fascinating, and suggest that financial support can have a positive impact on student experiences. The report also represents a step change in evaluation practice – a move away from weak approaches that lack a strong narrative towards those that seek to understand change rather than simply capture attitudes.

There’s a long way to go on improving evaluation practice, but this is a step in the right direction. Evaluation outcomes are only one part of the story, of course. The real impact will be shaped by how providers interpret and use their findings to improve practice so that their students receive the best support.

Context is key

An evaluation method is only as good, and as useful, as the quality of its design and implementation, and its appropriateness to context. The diversity of opinions in higher education research and evaluation makes this an exciting field. We must embrace this diversity in working together to achieve our common goals. Whatever your methodological or epistemological preference, and whatever the context of your evaluation, the most important thing is generating evidence that can improve your practice for students, staff and society.

Last February we published comprehensive guidance on evaluating outreach initiatives and new Standards of evidence to help providers understand the strength of their evidence. We also provided an evaluation self-assessment tool to help providers make judgements about their current practice and identify areas for improvement.

Continuing to support the sector

We understand the importance of good data for effective evaluation. The second report we’re publishing today looks at how data is currently used in targeting, monitoring and evaluating access and participation work. We expect to follow this up with a number of new projects to support the use of data.

The OfS’s commitment to generating evidence and making it more accessible is highlighted by our investment in TASO, the new ‘what works’ centre for higher education social mobility. Last month saw its first publication, and we’re looking forward to it increasing its reach and impact over the coming years.

We’re also investing to understand the impact of Uni Connect, which brings together partnerships of universities, colleges, schools and others across the country to offer activities, advice and information about higher education. This work includes a programme-level evaluation, internal data analysis and a capability building team – working on the ground with partnerships to help improve their practice. We also have plans to develop evaluation resources for new and smaller providers and to improve the use of qualitative data in evaluation. We can’t do this on our own, so we’ll be looking to the sector to work with us.

Published 18 February 2020
