
Student Generative AI Survey

Information Technology 2025-02-15 - HEPI Aaron

Josh Freeman

Foreword

Professor Janice Kay CBE, Director, Higher Futures

It is a pleasure to introduce this 2025 study, a welcome repeat of the 2024 AI survey of how full-time undergraduate students are currently using AI tools. It shows that use has soared over the past year, demonstrating that AI tools are used in varied ways in learning and assessment.

It is a positive sign overall: many students have learned more about using tools effectively and ethically, and there is little evidence here that AI tools are being misused to cheat and play the system. Students see a range of benefits of using AI tools, from saving time to improving the quality of their work in ways they consider to be personalised, especially outside study hours.

And yet, there are quite a lot of signs that will pose serious challenges for learners, teachers and institutions, and these will need to be addressed as higher education transforms. Policies on AI use for assessment are generally clear but, at the same time, students are uncertain about what acceptable AI use looks like, with less than a third stating that their institution encourages them to use it and nearly a third reporting that their institution bans its use. Some students report that they are ‘being warned about the potential risks of AI, but [staff] are actively incorporating AI as a creative tool into some of their modules’. Students want more of the latter.

They want more support in their courses to increase their skills in using and managing AI tools, and they also perceive that, while more staff are well-equipped to support them than previously, this needs to improve substantially. Peppered through the study is clear evidence of a digital divide, whether it is women using AI tools less and less confidently or those with greater means being more able to access premium products.
There are gaps, then, for higher education institutions: how AI tools are used effectively to support students’ learning and engagement, how students become better skilled, how staff are trained to have a deeper working understanding of AI tools and how divides in the use of AI are not allowed to develop and persist. I urge you not only to mull over the data presented here but also to take time to reflect on the conclusions and policy recommendations. I look forward to seeing what happens in the 2026 report.

Executive summary

Building on our 2024 AI Survey, we surveyed 1,041 full-time undergraduate students through Savanta about their use of generative artificial intelligence (GenAI) tools.

In 2025, we find that student use of AI has surged in the last year, with almost all students (92%) now using AI in some form, up from 66% in 2024, and some 88% having used GenAI for assessments, up from 53% in 2024. The main uses of GenAI are explaining concepts, summarising articles and suggesting research ideas, but a significant number of students – 18% – have included AI-generated text directly in their work.

When asked why they use AI, students most often find it saves them time and improves the quality of their work. The main factors putting them off using AI are the risk of being accused of academic misconduct and the fear of getting false or biased results. Women are more worried about these factors than men, and men report more enthusiasm for AI throughout the survey, as do wealthier students and those on STEM courses. The digital divide we identified in 2024 appears to have widened.

Institutions have maintained a good record on protecting the integrity of assessments, with 80% agreeing their institution has a clear AI policy and 76% saying their institution would spot the use of AI in assessed work – both increases from the 2024 Survey.
However, while students overwhelmingly believe it is essential to have good AI skills, only 36% have received support from their institution to develop them. The gap has grown between the number saying they want AI tools to be provided and the number saying AI tools currently are provided. However, staff literacy has increased, with 42% of students suggesting staff are ‘well-equipped’ to help them with AI, compared with just 18% in 2024.

In new questions for 2025, we found that just under half (45%) of students had used AI while at school, and more students agree AI-generated content would get a good grade in their subject (40%) than disagree (34%). But they are lukewarm about the possibility of exams assessed by AI: 34% would put in more effort, against 29% who would put in less effort and 27% whose effort would not change.

Based on these findings, we recommend that institutions keep their assessment practices under constant review, particularly as AI becomes more powerful and students become more proficient with AI tools, requiring staff to be supported to improve their AI literacy. However, institutions should not adopt a mainly punitive approach; instead, their AI policies should reflect that AI use by students is inevitable and often beneficial. Institutions should share best