Assessment practices and student behaviours surrounding Generative AI
Lead institution: The Open University
Country: England
Project primary contact: Mike Richards (mike.richards@open.ac.uk), Senior Lecturer
Project Title: Assessment practices and student behaviours surrounding Generative AI
The project is: In Progress
Project Summary:
We are running a number of projects on academic practices regarding generative AI, focussing particularly on assessment. Our first project was a dual-anonymous “quality assurance” marking exercise covering four end-of-module assessments drawn from a university CS curriculum. Of the 90 scripts marked, every ChatGPT-generated script for the undergraduate modules received a passing grade (>40%), and all of the CS1 (introductory) scripts received a distinction (>85%). This contributes evidence of how the release of generative AI will affect quality assurance processes in HE.
In most cases, across a range of question formats, topics and study levels, ChatGPT is capable of producing at least adequate scripts for undergraduate modules. Our second project focuses on the use of GenAI detection software. Using the TurnItIn AI detection tool, we examined 10,725 university student assessments submitted by two cohorts during the summers of 2022 and 2023. We observed an increase in the number of scripts flagged as containing AI-generated material and have analysed the demographic profile of flagged scripts. Our third, ongoing project is exploring how students use AI tools to solve challenging software development problems, akin to Advent of Code. Initial results are expected after Christmas.
Impact: To inform policy development based on empirical evidence of the capabilities of GenAI tools and student behaviours regarding these tools.
Audience: Higher education institutions, staff and students.
This output is part of a member project, AI Garage: Creating the Future Now, which collects and curates cutting-edge practice examples of AI.