Schools and university partnership working: Why evaluation and collaboration is the cornerstone of widening participation

John Blake, our Director for Fair Access and Participation, delivered the keynote address at the Centre for Transforming Access and Student Outcomes in Higher Education (TASO) conference on Thursday 28 April 2022.

John Blake speaking at the TASO Conference 2022.

In my time, I have been a teacher, a policy researcher and now a regulator, and I have been to a fair number of events where people claim to be "enormously excited" about issues or institutions that, frankly, you doubt they have given much thought to. It is a cliché that sadly robs us of the ability to know when people genuinely are excited by something.

So all I can offer to prove that I really am enormously excited by TASO’s work is that some of my loved ones have recently refused to carry on conversations with me about evaluation of widening participation work, on the grounds that there are other people paid to have conversations with me about it and, actually, we’re only on episode 2 of Moon Knight and time is passing.

As such, you now all have to bear the brunt of my enthusiasm.

The reason I have made evaluation my highest priority in my new role is in large part because I lived through the transformation in discussions about schools policy over the past 15 years.

I became a teacher in 2006 and, so Elon Musk’s newest acquisition tells me, joined Twitter in 2009. In those years, a revolution in the use of evidence in school policy was already building that blogging and social media then enormously expanded in scale, and made mainstream.

Whilst schools policy had long since ceased to be a secret garden tended only by teachers, it was, when I entered the profession, still a profoundly hierarchical debate. Contested, to be sure, but largely argued over between government ministers and their advisors, trade union officers, and university academics in education departments. The loudest school leaders could, through sheer persistence, win a seat at the table, but those at the chalkface day in and day out were rarely in the room where it happened, and their hard-won expertise was too easily ignored where it did not match the preferences of those who claimed to speak for them.

But as new communities formed through social media, this edifice began to crack. Ofsted felt it first: the content of its reports was pulled from school websites and compared side by side to identify contradictory statements from the inspectorate, and questions were asked about the evidence for its claims about what classrooms should look like. Teachers with a keen eye for research and a keyboard did what years of angry bluster about boycotts had not done, and changed the framework of the most powerful accountability tool in English schooling.

The national curriculum, behaviour management, pedagogy and pastoral interactions each became, in turn, the subject of intense debate with obvious, tangible impacts on the work going on in schools. Practitioners no longer needed to fight their way past gatekeepers to take part, and the tool they used was, mostly, accumulated research evidence, from here and abroad, which painted a different picture from the one the leaders of the sector had previously thought sacrosanct.

I am not, at all, saying that every teacher who had read a research paper was necessarily right about their suggested remedies for problems in the sector, nor that every school leader, professor of education or director of children's services was wrong. Nor am I suggesting that schools policy is now a perfectly formed technocratic wonder in which only evidence matters – not least because, in an area as profoundly important as education, a democratic nation should want an ongoing, lively debate about priorities and practices that goes beyond the evidence.

But schools policy now operates on a more equal footing where once seniority was all, and in which new systems, such as the new standards for Initial Teacher Training providers, are explicitly tied to the best available evidence, such that as the evidence base changes, so too will the frameworks.

To my mind, this ecosystem of evidence-based practice is essential to the future of all public services. The test of theory is not only that it makes sense to other theoreticians but that it can also survive contact with the hard reality of frontline provision, and the test of practice is that energies and efforts are directed towards those activities which research tells us make the most difference in the most effective and efficient way.

As I say, this work is hardly complete in the schools sector, but I had thought it might be further advanced in higher education, where research excellence is often cited as one of the sector’s greatest contributions to the nation and the world.

But, whilst there certainly are pockets of excellence, I have discovered that a great deal of the work higher education institutions do falls short of the standards of review and inquiry that are the baseline of academic endeavour.

That is why the work of TASO is such an essential and exciting contribution to the work of widening participation. It sits at the heart of a mission to ensure that access and participation works as a discipline: defined, investigated and improved, with the same rigour, open-mindedness and commitment to evidence over gut feelings that guide other fields of academic study.

Since its inception, the Office for Students has been committed to that agenda, and the existence of TASO owes a great deal to my predecessor, Chris Millward, who also did great work in pushing evidence and evaluation forward within Access and Participation Plans. But there is still more that can be done, and has to be done, to ensure that access officers, assessment writers, pastoral advisors, careers counsellors and everyone else doing the hard yards of this work can be confident that their collective efforts are making the strongest impact now and are contributing to improving our knowledge base so that more and better can be done tomorrow.

We can already see the fruits of a stronger focus on evaluation emerging, expanding our knowledge of what works but also helping improve our capacity to talk about some of the most difficult and persistent inequalities we face. For example, when I asked my team to pick out some of the best evidence-led work they had seen, Kingston University's development of an inclusive curriculum framework was brought up – this work contributed to a 16 percentage point reduction in the BME awarding gap at Kingston over six years. Important and impactful, but especially interesting to me because we are also aware of other work which suggests that there are limits to what can be done about the awarding gap through curriculum alone.

These are challenging issues, but ones where robust evaluation helps us avoid both the temptation to believe in silver bullets and the disappointment of a seeming lack of success. Failure and dead ends are as much an essential part of building disciplinary knowledge as eureka moments, perhaps more so.

Other work, such as the University of Oxford's Oxplore-Raise programme, is demonstrating ways of evaluating projects where the ultimate aim is a very long-term one. In this case – something close to my heart – a project to raise attainment in schools in order to expand access to higher education. But we cannot simply wait the many years it would take for the subjects of the intervention to reach (or not reach) higher education. Instead, the project identifies meaningful intermediate objectives which allow success to be evaluated as the intervention occurs.

It is crucial that we get this right as we move into the next stage of Access and Participation Plans. Colleagues may be aware that we are currently asking all institutions holding an APP to seek a variation to the plan to cover work they are doing and work they intend to do on our priority areas of strategic school engagement, quality, non-traditional pathways, and evaluation. A new cycle of APPs will begin from next year.

TASO has released, and will shortly release, further work that providers should look at carefully when considering how they will enhance their commitment to evaluation in the next round of access and participation work.

In particular, I would draw attention to the work TASO has done on the ethics of such research – I know from colleagues across the sector that this represents a challenge, and TASO has produced a very helpful guide to support practitioners with it. I am really looking forward to the upcoming publications on the impact of summer schools, a type of intervention which engages a huge number of students, and also to the publication of work on small sample sizes, helping providers who may not have large numbers of students in their cohorts to contribute to the expansion of knowledge too.

This is extremely valuable work from TASO which can and should make a real difference to the sector’s impact on students from disadvantaged backgrounds and underrepresented groups.

At the OfS, evaluation will be an essential part of our work in the next cycle of APPs. We are currently considering how to ensure that validation of effective practice has the same status as the validation of academic quality and the validation of data. Validation in each of those two areas, as colleagues will likely know, is undertaken by a specifically designated body whose work is independent but built directly into that of the OfS. We are keen to see how we can establish an equivalent process for effective practice in access and participation.

Evaluation may not be as exciting to everyone as it is to those of us here, and that’s fine.

But we know that the work TASO is doing, and is supporting and encouraging the wider HE sector to do, makes a profoundly important contribution to fulfilling the promise of higher education for every student, regardless of their background.

And that everyone should find exciting.

Published 06 May 2022
