AI Case Studies

Case Study: Blackboard Learn Ultra’s AI Design Assistant

Virtual Learning Environments (VLEs) – a ubiquitous tool in tertiary education – have begun to harness generative AI, leading to exciting new features and functionality. In this case study, we explore how University of Westminster and Northumbria University have been utilising Blackboard® Learn Ultra’s AI Design Assistant, which includes features that generate assessment rubrics, course structures, activity ideas, images, and quizzes and knowledge checks. 

University of Westminster 

We spoke to:  Professor Gunter Saunders, Associate Director Digital Engagement and Library Services; and Dr Doug Specht, Assistant Head of School 

Overview:  University of Westminster’s journey started with a consultation with Anthology (the company behind Blackboard Learn Ultra) on what AI features they and other customers would like to see Learn Ultra equipped with. Eager to use AI to aid educators in crafting authentic assessments, the Anthology client group, of which Westminster were a part, suggested that Anthology focus on AI features that could generate formative knowledge checks (e.g. quizzes). They also made the case for the ability to create assessment rubrics based on course content and learning objectives. These, Gunter and Doug explained, would help educators to communicate with students at scale the knowledge, understanding and skills that needed to be demonstrated in their work. That said, the client group were clear that AI’s role should be to develop a first draft, which educators would need to carefully critique and refine. 

When asked whether there were any aspects of AI the university wanted to avoid at this stage, Gunter and Doug noted that they were less inclined towards features that ceded too much control to AI, such as using it to mark students’ work without review from an educator. 

Anthology listened closely to the client group, and Westminster are now using AI-generated knowledge checks and rubrics to support learning and assessment. As an example, Doug described the use of AI-generated quizzes within a module designed to support students to write effective research proposals. A key criterion for assessment practices here was for students to be able to repeat different versions of these quizzes an unlimited number of times, enabling them to master the content and gain in confidence. This, Doug explained, would have been prohibitively difficult without the aid of generative AI, but was very much manageable with the AI Design Assistant in Blackboard Learn Ultra.

Meanwhile, colleagues in both the School of Computing and Engineering and the Westminster School of Media and Communication have reported significant time savings through creating AI-generated rubrics as first drafts. 

The university is also benefiting from an AI feature that generates authentic assessment tasks, enabling students to exhibit their abilities in realistic settings. For example, this tool has been used in Social Sciences to propose re-designed assessments that are more aligned with real-world scenarios and thus more resistant to the use of generative AI in producing a full answer. 


Further Insights:  As well as making use of Blackboard Learn Ultra’s AI features, the university is also benefiting from TeacherMatic, a generative AI solution that helps educators create a range of resources, including lesson plans, activity ideas, and schemes of work. 

At this stage in their AI journey, Gunter and Doug explained that they were glad to have multiple AI solutions to choose from; Learn Ultra’s AI features, for instance, benefit from being integrated within the VLE, whereas TeacherMatic offers a broad range of generative AI tools, and also benefits from being a standalone solution. In particular, they explained that the fact that TeacherMatic is not integrated into wider university systems means that users are often more comfortable with experimenting and honing their own skills in harnessing generative AI. 

Another point they raised was the importance of a positive culture surrounding an institution’s use of AI. They emphasised that the ideal situation would be for all stakeholders to be open about where AI is used (and how/why), but that this can be hard to achieve without clarity over which use cases are appropriate, and which are not: a line that is difficult to draw at this stage. Staff, for instance, may sometimes be unsure whether the use of AI constitutes taking shortcuts or not fulfilling their duties in full. Meanwhile, students continue to have concerns around how far AI can be used before questions of academic integrity arise. 

A key next step at the university, therefore, will be to foster discourse around the use of AI, so that stakeholders can voice their excitement, ideas and hesitations, thus shaping clear institution-wide frameworks for the responsible use of AI.  

In support of such frameworks, Anthology has put forward its own Trustworthy AI Approach, a set of principles and practices that enable users to benefit safely from artificial intelligence. Gunter and Doug highlighted this framework as a clear demonstration of Anthology’s values and of its commitment to responsible innovation.  


Thoughts on the future:  Gunter and Doug put forward an astute proposal for how VLEs could make further use of AI in the near future. Arguing that content in VLEs often feels quite siloed – it can sometimes be difficult to make links between different modules within the same course, for instance – they suggested that AI could be used to join the dots, so that learning materials build on each other in a holistic fashion.  

A teacher could, they foresaw, use generative AI to draft a task assessing what students had learned in a particular lesson. But instead of focusing solely on the lesson in hand, the AI could suggest ways to incorporate learning from previous lessons into the task: an idea that is technically plausible, as the system would have access to the relevant content.  

In addition, Gunter and Doug see a promising future in the use of generative AI to personalise learning experiences, and there is already evidence that some students are doing this themselves, essentially using generative AI as a ‘study buddy’. 

In thinking forward, Doug also considered that a pivotal question will be how the line is drawn between assistance/augmentation and automation. He mooted that one foreseeable future would involve students generating their own tasks, completing them, and having AI mark them: a scenario that could potentially cut the educator out of the loop entirely. 

That said, they remained optimistic that subject expertise, pedagogical skill, and the ability to inspire learners and ignite their passions remain key strengths that will ensure human educators stay indispensable. 

Northumbria University  

We spoke to: Lee Hall, Director of Academic Technology Services 

Overview:  Lee Hall is excited by the introduction of AI tools into Blackboard Learn Ultra, and he sees this innovation as a natural step in the evolution of VLEs, tools that from his perspective have always developed organically.  

Lee noted that VLE vendors have been highly responsive to the higher education sector’s needs in the past, which has led to the integration of critical workflows around assessment, content delivery and, in particular, moderation. Presently, he feels that Anthology has shown discernment and responsibility, as well as innovation, in its adoption of AI functionalities. Arguing that there has sometimes been a rush to integrate AI into the education sector reactively, Lee praises Anthology for focusing on use cases where AI can add real value.   

Northumbria University has recently started using the AI Design Assistant in Learn Ultra to generate formative assessment activities, using the in-built AI test generation tools; and they are also making use of the image generation tools. All academic staff now have the opportunity to use these features within their programmes of study, and the university has supplemented the roll-out with a range of supportive mechanisms, such as help guides, 1:1 support and the release of an AI Hub, which houses much of the policy development that the university is undertaking. 

Lee explained that the university is towards the beginning of its journey with Learn Ultra’s AI functionality, and is now actively deciding which AI tools to switch on next. 

Further insights: The ethical and social implications of AI are a big focus at Northumbria University. The university has launched an AI Hub, which provides a space for staff to express their ideas and views on AI’s use. It also has an AI Working Group, which steers the strategic implementation of AI projects, and has directly engaged student unions to foster a collaborative approach. Arising from these forums and interactions, the university has published a set of AI tenets – principles that underpin responsible use of AI. These tenets are captured in the image below.   

A colourful infographic showing Northumbria University’s AI Tenets. At the top are three horizontal bars of different colours, each representing a key principle: ‘Student and Staff Safety’ (orange), then ‘Clear Ethical Guidance’ (yellow), and beneath that ‘Data Privacy and Security’ (blue). Below these are two rows of green boxes containing related sub-principles; sub-principles in the top row relate to those in the bottom row. The first in the top row is ‘Responsible Research’, with ‘Conscious of Social Impact’, ‘Human Oversight’ and ‘Public Good’ underneath. Next is ‘Education and Awareness Availability’, with ‘Regular Review’ underneath. Finally, ‘Interdisciplinary Collaboration’ is last in the top row, with ‘Industry and Policy Collaboration’ underneath.
Figure 1 – Northumbria University’s AI Tenets   

Anthology’s attention to AI ethics has been a key attraction for the university. Lee cited their Trustworthy AI Approach as an example of commendable practice, and noted that educators always feel in the driving seat while using AI in Blackboard Learn Ultra, as they can easily control which features are used in which contexts.    


Thoughts on the future:  Lee foresees simpler interfaces centring around personal AI assistants being the direction of travel for VLEs. Instead of having to navigate through platforms themselves, users could simply declare what they want to achieve to their assistant, who would dutifully oblige. “It’s not too far removed from the old Windows paperclip”, noted Lee.  

He also discussed the innovation of educators having their own digital twins: bespoke solutions trained on an educator’s content (lesson plans, research papers, programmes of study, etc.). These tools could be available to answer students’ queries and provide academic support on demand, thus freeing up staff to further develop their programmes of study, focus on pastoral care and use their time more effectively. 

When discussing the future of AI with Lee, the topic of ethics arose again. In the coming years, Lee hopes that some of the more prickly problems surrounding the development of AI models will assume greater prominence and start to be addressed. Environmental impact is high on this list, as are the hardships that low-paid workers often endure whilst tagging AI inputs.  

Thoughts and reflections

As a focal point for an institution’s digital ecosystem, VLEs are well-placed to host and integrate emerging AI capabilities. The insights from The University of Westminster and Northumbria University demonstrate that Anthology has made strong and grounded progress in making AI tools more widely available to the sector.  

A further takeaway was the need for clarity over what constitutes acceptable use of AI. Frameworks and guidance play a pivotal role not only in safeguarding against harm, but in giving organisations the confidence to innovate responsibly. Given this, Anthology’s Trustworthy AI Approach merits particular recognition.  


By Tom Moule

Senior AI Specialist at The National Centre for AI in Tertiary Education
