The intersection of AI and accessibility continues to be a frequent topic in our AI in Education community discussions and a key area of interest for the sector as a whole. With that in mind, we wanted to give an overview of some of the developments we’ve written about over the last couple of years, our recommendations for institutions, and how to get involved in these ongoing conversations.
So far
In 2023 we looked at some of the new ways generative AI was being used to enhance accessibility in education, exploring its potential to personalise content, support reading and writing, and improve the digital accessibility of resources. Then, as things developed through 2024, we started to see generative AI being considered as an assistive technology in its own right, as well as a way to enhance the capabilities of more longstanding assistive tech.
In our pilot projects, we’ve seen some of this potential realised. Jamworks, for example, is a student-focused tool that combines proprietary speech recognition models and generative AI to capture and organise information from lectures. This can relieve the pressure of note-taking and address more specific needs, such as providing real-time captioning for hearing-impaired students. In our pilot with two FE colleges, the platform was felt to be an effective tool for supporting students who find traditional lectures less accessible.
Additionally, in our recent pilot of LearnWise, a student support service, feedback highlighted several accessibility benefits for users compared to requesting support from traditional services. Students mentioned the perceived lack of judgment from an AI system, as well as the 24/7 availability of AI support. Our full pilot report on LearnWise is due to be released in the coming months.
Through engaging with current students and learners at our student forums over the last year, we know that they are increasingly aware of, and concerned about, equity, bias, and accessibility issues related to AI. Students are advocating for measures that address these challenges to ensure a safe and inclusive experience.
Meanwhile, concerns around the impact of generative AI on academic integrity have caused significant anxiety for both students and staff. It has become clear that there is a need to manage the use of these tools in education to ensure they are implemented ethically and effectively. However, it is also key that disabled and neurodivergent users are able to benefit from the significant opportunities these tools can offer, and that they are not disadvantaged by restricted access to assistive technology that is AI-enabled. A key question raised has been what amount or intensity of generative AI use might be considered a reasonable adjustment.
The use of Grammarly emerged as a key example in this discussion due to its status as a well-established assistive tool with a large user base of students and learners. We explored the questions Grammarly raised for institutions and recommended some approaches. At the end of last year, we also had a look at their new Authorship feature which seeks to help students acknowledge their AI use.
We’re aware, though, that these issues extend beyond any one tool. Generative AI capabilities are widely available across a range of tools and are increasingly embedded in existing platforms. Importantly, these issues also affect the experience of disabled and neurodivergent staff, who should not be overlooked in these conversations either.
These discussions are naturally linked to wider conversations around generative AI use in the sector. Our continuing work in this area will look at how we can support institutions in developing acceptable use guidance which prioritises inclusion and accessibility.
Recommendations
At this time, we must ensure that students and staff continue to be supported by technology. We also want to explore the opportunities AI can present for accessibility and inclusion.
We recommend that institutions seek to:
1. Maintain users’ access to existing supportive tools and features
   - Avoid restricting access to established tools like Grammarly that neurodivergent and disabled users rely on for assistance
   - Generative features can generally be restricted or turned off for controlled assessments, which can ease concerns over academic integrity
2. Consider how assistive uses fit into institutional approaches to defining ‘acceptable use’ of AI tools
3. Increase awareness among staff and students of the benefits of AI tools for accessibility, and create opportunities for safe exploration of appropriate tools and services
4. Involve your neurodivergent and disabled users, and your disability services, when piloting new AI tools and when creating guidance or policy around AI use
Get involved in the discussion
We invite everyone working in this area to join Jisc’s Accessibility and Assistive Technology communities to continue these important discussions.
You can also help by sharing your experiences of using generative AI for accessibility by submitting your examples to our Generative AI in Practice Hub. We showcase real examples of generative AI use in education, submitted directly by practitioners. These help to widen understanding of generative AI uses and share good practice across the sector.
If you have an example of using generative AI to support accessibility that you would like to share through the hub, please let us know through this form.
Find out more by visiting our Artificial Intelligence page to view publications and resources, join us for events and discover what AI has to offer through our range of interactive online demos.
Join our AI in Education communities to stay up to date and engage with other members.
Get in touch with the team directly at AI@jisc.ac.uk