In our latest think piece, GuildHE's Policy Programmes Officer Omoye Otoide explores the case for sector-wide collaboration in AI, how vocational and specialist institutions add value, and how the sector can shape this vital conversation together.
Artificial intelligence (AI) is no longer an emerging trend; it is a defining force in the future of higher education. From generative tools like ChatGPT to adaptive learning platforms, AI is transforming the way we teach, learn, and manage educational spaces. Whilst innovation moves quickly, national policy is still finding its footing. Vocational and specialist institutions are experimenting, adapting, and leading in ways that often go unnoticed.
However, without a shared regulatory framework or sector-wide coordination, these efforts risk becoming fragmented and inefficient. To truly shape how AI is implemented in higher education, policymakers and regulators must not only provide clear guidance but also create the space for genuine collaboration. Strategic engagement with the full diversity of institutions is essential to ensure the sector can respond to AI responsibly, not reactively, and co-create policy that reflects real-world practice.
AI presents both opportunity and risk. Left unexamined, its rise could widen existing inequalities in access, outcomes, and representation – particularly for students from underrepresented or disadvantaged backgrounds. Recent guidance from the Department for Education and legislative efforts such as the Artificial Intelligence (Regulation) Bill [HL] (2023) show growing awareness of AI’s impact on higher education. But at present, higher education institutions are often left to interpret these policies and regulations independently, with little clarity around what good practice actually looks like and who is responsible for defining it.
The lack of clear national regulations has created unevenness in how AI is being implemented across the sector. Smaller institutions, often working with fewer resources, are left to navigate this space without consistent guidance. Without targeted support, there is a risk that future policy will reflect the priorities of larger institutions and not the full diversity of the sector. A more inclusive approach requires shared principles, practical tools, and real-world insight that these institutions can offer.
Ultimately, this is the moment for government and regulators to collaborate with the sector on the conditions for safe, ethical, and inclusive use of AI, and to move from advisory guidance to a coordinated strategy with vocational and specialist institutions in mind. An approach that evolves in step with real-world skills and practice could encourage innovation while protecting what makes specialist education distinctive.
Vocational and specialist higher education institutions have several strengths that can enrich national conversations on AI. These institutions are often more agile, closely connected to their communities, and rooted in real-world learning. This puts them in a strong position to pilot AI in their disciplines. For example, some institutions may use AI to provide faster feedback, create more personalised learning plans, or support students with additional needs. In creative subjects, AI is being tested to help brainstorm ideas or experiment with new styles. In healthcare education, AI might support training simulations or assist with managing large volumes of information.
One example of this innovation can be seen in one of our members, Arts University Bournemouth (AUB). AUB launched a short course in 3D Modelling and Visualisation using AI, helping students develop hands-on skills with generative tools shaping the future of design. AUB also hosted a generative AI exhibition featuring the world’s first digital supermodel, exploring the intersection of technology, identity, and creativity. These initiatives show how specialist institutions are not just keeping up with AI developments, but are innovating in ways that reflect their unique strengths.
These examples show how AI can be a useful tool to enhance teaching and learning with the right support in place. However, without clear guidance or investment from government or regulators, these same tools can also create challenges. In areas such as the creative arts, generative AI challenges longstanding understandings of authorship and originality. Without protection for creative intellectual property (IP), these disciplines risk losing the core practices and values that define them. Similarly, in professions rooted in trust and human judgement, such as nursing or teaching, poorly governed use of AI could undermine those very values.
In parallel, guidance from the QAA on Generative AI offers a useful starting point for institutions seeking to embed AI in curriculum design and assessment without compromising academic integrity. These resources explore how AI can be harnessed positively while maintaining standards, and highlight the risks of undermining core educational values if implementation is rushed or unregulated.
Without clear direction from government and regulators, higher education institutions may hesitate to experiment and innovate with their teaching methods. Others might move too quickly without enough reflection. This risks creating a two-tier system in which some providers progress with few guardrails, whilst others are left behind.
AI use should not be seen as the sole responsibility of individual institutions, but as a shared challenge across the entire sector. Addressing it requires coordinated action, clear policy direction, and a supportive environment created by government and regulators. This means investing in staff development, ensuring equal access to AI tools, and enabling knowledge sharing across institutions. With stronger collaboration and the right foundations in place, the higher education sector can contribute to shaping ethical, inclusive, and student-centred AI use.
As AI becomes more embedded in the student experience, it is crucial to consider the emotional and psychological implications. For some, AI reduces workload pressure and enhances accessibility. But for others, it creates confusion, anxiety around authenticity, and fear of surveillance. Recent findings from Jisc’s 2025 Report on Student Perceptions of AI highlight that many students are anxious about the speed of AI developments and want clearer guidance and support from their institutions. Some students reported using AI for relationship and mental health advice.
These insights make clear that student wellbeing cannot be an afterthought in AI adoption and that individual institutions cannot be expected to navigate this alone. What is needed is a coordinated response in which the government, regulators, and the sector work together to shape national guidance that reflects real student needs, institutional diversity, and ethical responsibilities. This will not only protect students but also encourage institutions of all sizes to adopt AI responsibly.
AI is reshaping higher education in real time. Higher education institutions cannot afford to be passive observers. Institutions of all sizes, including smaller and specialist providers, are uniquely positioned to contribute to an ethical, inclusive, and values-driven implementation of AI.
As Parliament explores what ethical AI governance should look like, GuildHE members must seize this moment to shape the narrative – bringing forward real-world insights from their institutions to shape regulation that reflects sector diversity, student need, and innovation.
To truly seize the moment, vocational and specialist institutions need to be actively engaged as national guidance is developed. Collaboration must go beyond consultation: it should include co-designing practical standards with HE institutions and embedding the perspectives of educators, students, and sector leaders into policy development. Policymakers must also ensure the regulatory environment encourages innovation, safeguards IP, and protects the strengths of diverse institutions.
This is a call not just to be heard, but to co-create. By collaborating with government and regulators, vocational and specialist institutions can help design a future where AI use in higher education is not only ethical and inclusive, but distinctly shaped by the diversity of the sector.
As AI continues to evolve, GuildHE is keen to understand how members are engaging with it across learning, teaching, and wider business operations. If your institution is developing innovative practices, facing implementation challenges, or simply exploring the role of AI, please email [email protected]. Your insights will help us shape targeted support and influence future policy work across the sector.