
Generative AI: Lifeline for students or threat to traditional assessment?

Co-authored by Marieke Guy, Head of digital assessment, UCL and Chris Thomson, Jisc.

As generative AI embeds itself into the world, understanding students’ perspective on what it means for their learning and assessment is more important than ever.

We live in fast-moving and concerning times. The leaders at our HE institutions have found themselves flung into a brave new world where the ground shifts almost daily. Students too are learning to navigate this new landscape. As well as disruption to assessment, we’re beginning to see how students are engaging with these new tools in ways that complement and challenge notions of what it means to study for a degree.

With this in mind, Kathy Armour, Vice-Provost (Education & Student Experience) at UCL, brought these two groups together in March, in the shadow of London’s Olympic Stadium: Russell Group university leaders and a panel of students met to discuss the future of degree assessment.

A team from across UCL with some support from Jisc worked with the panel of students over a number of weeks to explore these emerging issues.

Our aim was to start forming a set of principles that universities and other institutions could follow as they consider how to respond to the challenges of mainstream generative AI. We’d like to share what we learned from working with these students.

Staying relevant

Students are not waiting to be told what to do! We saw that AI is already an established part of legitimate scholarly practice for many students, and that they have a keen awareness of the complex situation facing tertiary education.

Their accounts made it clear that the genie is out of the bottle; AI is now so deeply integrated into their learning experience that it would be futile and counter-productive to resist the change. For many, AI has become a “lifechanging” educational companion, offering a level of support that is impossible to ignore. This could mean getting a quick summary of a lengthy article, or supporting international students in translating an assignment drafted in their first language into English.

The students argued that returning to traditional exam halls, engaging in an AI detection arms race or proscribing AI outright would be detrimental to their future employability and wellbeing.

Our students want to be prepared for the wider world and the future workplace. AI is becoming part of that reality, so institutions need to engage with AI in learning, teaching and assessment to ensure validity, currency, authenticity and relevance. Students pointed to friends who were already creating AI-based start-ups.

Maintaining academic rigour

Whilst AI-enabled automation has benefits, it is important to ensure that valuable learning and developmental opportunities are not missed. Students should still be encouraged to build academic skills, including study skills and the ability to digest and critique information, and the panel recognised there was a delicate balance to strike here.

They encouraged us to be curious about why some students might use AI tools to cheat. For many, one panel member suggested, much of this comes down to high levels of stress, unrealistic workloads or a lack of support with study skills.

Providing clarity

Our panel was looking for clear guidance from their institution about where and how AI tools are being used to support teachers in marking and feedback. Their views on this were sophisticated: they recognised the benefits to teachers of reduced workload and stress through automated processes, and how that could lead to improved relationship building and contact time. They balanced this with concerns about how transparent these emerging tools might be, noting that certain aspects of feedback and marking were better suited to AI enhancement while others, involving higher levels of learning, needed the human touch.

They also wanted clarity on how their use of AI to support essay writing, assuming it is properly acknowledged and referenced, might affect marking. A few asked: might a student who submits a piece of work without AI assistance be marked higher or lower than someone who used AI to produce a better-quality result? These are complex ethical questions with no clear answer yet.

Ensuring fairness

Students are extremely concerned about fairness in assessment and marking where AI tools are used. A particular area of concern was how a lack of access to core AI tools would disproportionately affect already disadvantaged students. The panel discussed the potential for a widening gap between those who can afford AI tools and those who cannot. They raised the question of whether universities should provide paid-for versions of AI tools as part of their standard IT provision.

Image: the student panel listening to Kathy Armour give a summary of the discussion. The panel was facilitated by Chris Thomson, Jisc; the summary was provided by Kathy Armour, Vice-Provost (Education & Student Experience) at UCL.

Enhancing relationships

We’re being presented with a significant opportunity for open dialogue with students about the purpose of education and assessment, both in relation to AI and in a broader sense. This thought-provoking event demonstrated just how valuable that dialogue can be.

As Kathy Armour noted in her closing remarks, the challenges posed by AI and assessment are not new; they are rooted in longstanding issues of assessment and curriculum design that continue to challenge the sector.

Embracing the potential of AI in education can offer a lifeline to students, but it requires a delicate balance between technological innovation and maintaining the integrity of traditional learning experiences.

We feel that, by working together, students and educators can create a path forward that incorporates AI in a way that benefits all.

Thanks go to those involved in this work:

Students:

  • Matthew Banner – Postgraduate in the third year of a PhD in Biochemical Engineering, leading on a student-led partnership project considering assessment design and AI.
  • Sophie Bush – Undergraduate student on the History and Philosophy of Science BSc and lead course rep for Science and Technology Studies.
  • Megan Fisher – Second-year undergraduate student studying Economics, with chosen modules in Environmental Economics and Algebra.
  • Rachel Lam – First-year undergraduate law student, serves as a student partner on the assessment design and quality review team.
  • Jennifer Seon – In the final year of her part-time master’s programme studying Education and Technology; her dissertation will focus on collaborative problem-solving in assessment. Recently interviewed AI expert Wayne Holmes for a podcast with the UCL AI Society.
  • Bernice Yeo – Postgraduate student taking the MA in Education and Technology. Works as an examiner for the International Baccalaureate.
  • Sopio Zhgenti – Postgraduate student studying Education and Technology at the Institute of Education with special interest in Artificial Intelligence.

Staff:

  • Marieke Guy (Head of Digital Assessment), UCL
  • Zak Liddell (Director of Education & Student Experience, MAPS), UCL
  • Joanne Moles (Head of Assessment Delivery and Platforms), UCL
  • Jennifer Griffiths (Associate Director in the UCL Arena Centre for Research-based Education), UCL
  • Lizzie Vinton (Assessment Regulations and Governance Manager, Academic Services), UCL
  • Chris Thomson (Programme lead for teaching, learning and assessment), Jisc

The event also featured visionary case studies from sector experts on AI: Sue Attewell, Head of edtech and lead of Jisc’s AI team; Professor Mike Sharples from the Institute of Educational Technology at the Open University; and Michael Veale, Associate Professor and Deputy Vice Dean (Education) in the Faculty of Laws at UCL.

This blog post was edited with the assistance of ChatGPT4.

By Marieke Guy

Head of digital assessment, UCL
