Navigating the Future: Higher Education policies and guidance on generative AI

In our previous blog, we highlighted the transformative potential of generative AI in Higher Education (HE) and called for examples of AI policies and guidance from the HE sector. The response has been fantastic, and we are excited to share more examples from our community. We hope that these will help institutions develop thoughtful, ethical, and effective AI guidelines by learning from best practices. 

Here is a list of the guidelines and policies we have received so far. We have included a short summary for each, created with the help of Copilot for Web:

For staff

A resource for Imperial’s education community to stay updated on generative AI developments in learning environments. There are links to regularly updated resources, including an ‘Introduction to AI’, a ‘Teaching Toolkit’, ‘Case Studies’, ‘Training and events’, ‘Policies and Ethics’, along with ‘FAQs’.

This guidance aims to support the adoption and integration of generative AI at different institutional levels – macro (university), meso (department, programme, module), and micro (individual lecturers, especially those with assessment roles). There is also guidance for staff on adapting assessment approaches and integrating AI into the curriculum.

The document provides guidance for learning and teaching staff, support staff and researchers (LTR staff) across UHI to support the effective use of generative AI for activities associated with teaching and research, in ways that support and enhance practice. It lists University-approved generative AI tools, allows staff to decide how students may use them for assessment and gives examples of how staff can use them appropriately.

This guidance from the Communications team covers guidelines to support teams in using generative AI tools safely, ethically and effectively. It includes opportunities and risks, usage guidelines, ethical considerations and training and transparency.

These guidelines have been developed to support staff undertaking teaching, research or administrative tasks on behalf of the University of Leeds. They also extend to external lecturers, assessors and PGRs who have a temporary or part-time staff role. Responsibilities, academic integrity, transparency and data rules are all included.

This guidance has been put together to support staff in using generative AI in student education. It covers assessment categories, writing a ‘Gen AI statement’ for assessment, case studies, and guidance and resources to support discussions with students.

This guidance, from Australia, outlines directions for the future of assessment. It seeks to provide guidance for the sector on ways assessment practices can take advantage of the opportunities, and manage the risks, of AI, specifically generative AI.

This guidance is specifically for academics on integrating generative AI in assessment. It covers an integration process, an AI and assessment framework and examples of when the use of genAI may be considered unacceptable.

For students

A resource for Imperial’s student community to stay updated on generative AI developments and their applications in education. It highlights the prominence of models like ChatGPT and Copilot, emphasising their potential in everyday use and educational enhancement. There is guidance provided on using generative AI responsibly and effectively, including referencing AI sources in academic work.

This guidance for students defines AI and generative AI, covers academic integrity and provides examples of how generative AI can be used in their studies. There is also support and guidance to help students use AI tools appropriately, acknowledge their use in academic work and develop critical AI literacy skills.

This resource is intended to introduce students to generative AI in the context of their university studies. It explains how generative AI works, ethical concerns and academic misconduct, along with how students should critically evaluate outputs, apply the tools to their studies responsibly and appropriately, and recognise the differences between original and AI-generated content.

This guidance aims to help students understand the university’s position on AI tools and their use. It covers limitations, responsible use, referencing and examples of when students may use the tools. The guidance includes videos along with text.

The document provides guidance for students across UHI to support their effective use of generative AI for their learning and coursework, in ways that support and enhance practice while also meeting the key considerations outlined. It includes when generative AI can be used, ethical uses and examples of good practice.

This micro-course, created by Durham University, explores how and whether AI tools can be used in learning, module assessments and exams, the associated risks and ethical issues, and prompt engineering.

This guide introduces students to the potential benefits of some of the emerging generative AI tools. It covers a range of uses, including how to use the tools as a study buddy and how to acknowledge their use to avoid plagiarism issues.

This page details student-specific resources, including templates for acknowledging AI use and a checklist for reflecting on the nature of generative AI use.

For staff and students

This guidance supports the use of GenAI tools by staff and students to enhance critical AI literacy, with guidelines on permitted uses and acknowledgment in assessments. It includes the risks associated with its use, guidance on assessments and an AI glossary.

This hub brings together all the latest information, resources and guidance on using generative AI in education. There are links to an ‘Introduction to generative AI’, how to use the tools in assessments and video guides on ‘designing assessment for an AI-enabled world’.

This policy governs the use of GAI tools to enhance learning, teaching, research, and professional duties, emphasising ethics, privacy, and legislation. It applies to all learning and teaching staff, support staff, researchers and students.

These principles guide the use of generative AI in learning and teaching at the University of Sheffield. They cover a positive approach to AI, limitations, equity of access, ethical use of the tools and academic integrity.

This hub outlines the University’s position on the use of generative AI and shares relevant guidance, resources, and communications on this subject. It includes their position on the use of AI tools, guidance for staff, guidance for students and further resources.

This policy provides a University-level framework for staff and students on how and where it is appropriate to utilise AI for learning, teaching and assessment activities.

This guidance covers the University’s commitment to the use of generative AI, along with the expectations of staff and students. Specific guidelines are provided for the use of Generative AI in assessments, ranging from research support to creative material creation, with clear instructions on what is allowed for each assessment (see this link for a more detailed guide on assessment: University of Wales Trinity Saint David AI and assessment guidance).

The University of Limerick have published five principles on generative AI which are applicable to academics, professional support staff, researchers and students. Along with this, the Academic Integrity Unit’s GenAI webpage provides information for staff and students.

This toolkit was developed as part of the SATLE-funded (AI)2ed: Academic Integrity and Artificial Intelligence project at University College Cork, which paired students and academic staff to explore the responsible use of GenAI in higher education. The toolkit offers guidance on using GenAI responsibly, including information on academic integrity and AI literacy, and also contains practice case studies of assessment tasks from across disciplines that either integrate GenAI or mitigate against its misuse.

These examples demonstrate a wide range of approaches to AI policy and guidance, reflecting the unique needs and contexts of different institutions. Thank you so much to all those who have shared their policies and guidance so far. We really appreciate your support and your willingness to share with the wider education community.

How to contribute 

We continue to seek contributions from HE institutions that have developed or are in the process of developing AI policies and guidelines. If you have any documents or resources that you are willing to share, please send them to AI@jisc.ac.uk. We will collate these resources and add them to this blog post.

Thank you for your input and expertise. We look forward to receiving more contributions and continuing this important conversation. 

Feel free to share this call to action within your networks and encourage others to participate. 


Find out more by visiting our Artificial Intelligence page to view publications and resources, join us for events and discover what AI has to offer through our range of interactive online demos.

For regular updates from the team sign up to our mailing list.

Get in touch with the team directly at AI@jisc.ac.uk
