Our Director of Strategy and Delivery, Josh Fleming, explains our position on artificial intelligence.

As the independent regulator for higher education in England, our role is to ensure that students from all backgrounds benefit from high quality higher education, delivered by a diverse, sustainable sector that continues to improve. We do this against a backdrop of social, economic and technological changes, perhaps none of which have captured the public imagination more than the growth of artificial intelligence (AI).
Use of AI among students is increasing rapidly, pointing to a future in which AI is an everyday part of the academic experience; for some students, it may already be.[1] Beyond higher education, AI has the potential to reshape the working lives of today's graduates.
Used well, AI offers a huge opportunity to change students' experiences of higher education for the better. Keeping pace with this change is an important part of our role as a regulator.
What we have heard so far
During our strategy engagement, AI listening sessions and the May student debrief, students and student representatives spoke enthusiastically about the possibilities of AI. Students recognise the importance of AI literacy to their future careers and are excited about the responsive support AI may facilitate.
We heard about the diverse ways AI is being used by institutions to meet students' needs in different academic contexts. AI can enable immersive training that replicates real-world scenarios and provide real-time feedback on students' work. The expectation for institutions to embrace these opportunities is widespread. Students are using AI in a range of ways to support their learning and prompt creativity, such as to break down complex information, language and tasks, visualise data and write computer code. To help them make best use of AI, students emphasised the importance of clear, consistent and accessible guidance which considers varying access to AI technologies.
Our discussions with policy experts, sector representatives and institutions highlighted an appetite to integrate AI in ways that enhance students' experiences and outcomes. We also heard concerns that uncertainty about regulatory implications could deter institutions from fully leveraging AI opportunities. Institutions and sector representatives underlined the importance of collaborative working to identify, explore and resolve shared challenges and exploit joint opportunities.
Our position on AI
We take a principles-based approach to regulation, in part to support and encourage innovation and experimentation, whether in AI or beyond. There's no requirement for any particular scale or form of AI use. Indeed, AI may even help institutions meet and surpass the student-focused teaching and learning requirements we set in our regulation. By focusing on principles and outcomes rather than prescribing specific AI applications, we seek to foster an environment conducive to innovation while safeguarding the interests of students. Where AI integration has the potential to enhance the student experience, boost educational outcomes and equip students with the skills needed to serve labour market demand, we want as many students as possible to benefit.
The possibilities that AI brings are huge, but we recognise that AI comes with risks, particularly with regard to academic integrity, the meaning and credibility of assessment and qualifications, and dynamic effects on access and participation for underrepresented groups. Resisting AI integration, however, would be a temporary solution with significant opportunity costs for both institutions and students. The HEPI research cited above shows rapid acceleration in uptake among students, pointing to a future (and for some, a present) in which AI is an everyday part of students' higher education experiences. And with AI literacy increasingly important to employers, it's critical students develop the skills they need for their future careers.
The answer will instead involve thinking creatively about when, where and how AI can be integrated in ways that protect and improve students' educational outcomes. This will involve experimentation and will look different for different institutions. Likely applications will range across teaching, learning and student support. We recognise experimentation isn’t linear and that mistakes are likely to be made by well-intentioned practitioners and institutions as they grapple with emerging and rapidly changing technologies.
The following prompts are intended to facilitate experimentation by sketching out broad parameters within which institutions can develop their approach to AI. These questions won't have definitive answers, but they highlight the issues we would expect institutions to have considered.
- How does your approach support staff to harness the benefits of AI in their teaching and approach to assessment?
- How does your approach support students to harness the benefits of AI in their learning?
- How does your approach support the AI literacy required by the labour market without displacing opportunities to develop deep skills and acquire knowledge critical to a student’s discipline?
- How are you leveraging AI to enhance accessibility and promote equality of opportunity for students from all backgrounds?
- How are you using AI to streamline administrative processes, reduce staff workload, and drive efficiency?
- How does your approach to AI provide assurance to staff and students about data security and confidentiality?
- How does your approach to AI build trust by promoting transparency about how AI is used?
- How do you ensure that your AI policies are clear, accessible and well understood by staff and students?
Good guidance
In our discussions with students, clear and accessible guidance about acceptable AI use was their number one ask. A lack of clear guidance can lead to inconsistencies, confusion and anxiety, deterring uptake. Many individuals and institutions across the sector are taking steps towards addressing these concerns. We've seen examples of principles-based frameworks, of checklists for students to help ensure good academic practice, and of efforts to more fundamentally rethink assessment methods and pedagogy.
If you've seen or implemented effective AI policies, please let us know! You can share examples by emailing [email protected]. We're especially keen to hear from students.
What next?
There's a lot of uncertainty about the future of AI, and I won't attempt any concrete predictions about the technology. Within this uncertainty, at least three things seem likely: AI will get better, its use will become more prevalent, and its influence on higher education will increase. In this context, we encourage universities and colleges in England to experiment with how they can best support students, to try new and interesting things, and to be mindful of the risks as they do so.
Notes
1. HEPI, 26 February 2025, Student Generative AI Survey 2025. 92 per cent of students use AI in some form – up from 66 per cent in 2024. 88 per cent have used generative AI (GenAI) for assessments, up from 53 per cent in 2024.