For many students, generative AI is no longer a novelty or an experiment. It’s already woven into the fabric of everyday university life.
This year’s Student Experience Forum (SEF), led by Queen Mary Students’ Union, set out to understand how national conversations about AI are playing out locally at Queen Mary. Through an online survey of 508 students and in‑depth interviews with 15 students — including those with disabilities or long‑term health conditions, students for whom English is not a first language, and students balancing significant paid work alongside their studies — a richer and more nuanced picture of student AI use at Queen Mary begins to emerge.
Vandy Widyalankara (final-year law student and Student Voice and Feedback Assistant) and Marianne Melsen (Head of Student Voice and Insights), who led the survey, share their preliminary findings here, with the full findings to be shared more widely later in the year.
Explaining the importance and relevance of the project, Vandy said: “AI is a hot topic at QM and in wider society, but because it’s such a new topic, we have limited research to base our understanding on, and we often have to make assumptions about the ways students are using generative AI. With this project, we are hoping to give everyone a shared evidence base to build on when we design policies and support services going forward”.
AI as an everyday tool — but not for everyone
The headline figures from the survey suggest widespread adoption. Over 80% of respondents report using AI for academic or non-academic purposes. Among these students, AI use is frequent: 43% say they use AI for academic purposes weekly, and a further 25% use it daily.
Yet this is not a uniform picture of enthusiasm or confidence. Nearly one in five students (19.3%) say they don’t use AI at all. For most of them, this is not a matter of access or curiosity but of values and trust. Ethical concerns (63.3%), worries about misinformation (59.2%), and fear of academic misconduct (35.7%) all feature strongly. For some students, opting out of AI use is a deliberate and principled choice.
Even among users, engagement is cautious. A quarter of respondents pay for access to AI tools, including premium versions, hinting at emerging inequalities between those who can afford more advanced tools and those relying on free, limited versions — an issue students themselves increasingly notice.
How students are using AI for learning
When students do use AI for academic purposes, their motivations are often pragmatic. The most common uses are to understand or explain difficult concepts (78.5%) and to research or summarise academic articles (56.1%). These practices suggest that many students experience AI less as a shortcut and more as a support — a way to make sense of complex material, manage cognitive load, or navigate unfamiliar disciplinary language.
More contentious forms of use exist, but are far less common. Around 29% say they use AI to write, edit, or improve assignments. Only a small minority report using AI for most of their assessed work: 3.7% say they use it for all or almost all assessments, and a further 5.1% for more than half. While these figures deserve attention, they complicate common assumptions that AI is being used wholesale to replace student work.
Instead, what comes through strongly — especially in interviews — is students’ careful line‑drawing. Many describe actively negotiating where AI “helps” versus where it might cross a boundary, often without clear guidance from the university on where the line between productive and unacceptable use falls.
Uncertainty shapes student behaviour
One of the most striking findings is how deeply uncertainty shapes students’ AI practices. Over a quarter of respondents (26.6%) say they have received no guidance from the University on using AI for academic work, while another 12.4% aren’t sure whether they’ve received any guidance at all.
This lack of clarity has consequences. Seven in ten students say they have avoided using AI because they weren’t sure what the university’s rules are. Unclear guidance is also one of the most frequently cited barriers preventing students from engaging with AI at all.
In interviews, students spoke candidly about second‑guessing themselves: Is this allowed? Do different lecturers think different things? Will using AI now cause problems later?
When asked what support they would value most, students were clear and practical. Over a third want help with effective prompting, while nearly 30% want clearer guidance on how to cite or acknowledge AI use. These point to students’ desire to use AI responsibly, transparently, and with academic integrity intact.
Beyond study: AI in students’ wider lives
Perhaps the most revealing findings sit beyond academic work altogether. Students are using AI as a companion for life as much as for learning.
For non‑academic purposes, AI is commonly used for physical health questions (36.1%), life admin (33.7%), and career planning (32.9%). These uses align with students juggling tight schedules, financial pressures, and uncertainty about futures beyond graduation.
More ethically complex are findings on wellbeing. Over a quarter of respondents (26.1%) report using AI for emotional support or stress management, and 11% for social interaction or companionship. Notably, more than a third of students say they are more likely to seek wellbeing support from AI than from university services.
Interviews suggest AI support is perceived as faster, less judgmental, always available, and easier to approach in moments of vulnerability — especially for students managing anxiety, disability, language barriers, or time constraints.
What this means for digital education
Taken together, these findings resist simple narratives of AI as either solution or threat. Students are neither blindly embracing nor entirely rejecting AI. Instead, they are experimenting, hesitating, adapting.
The message for digital education is not simply “use AI” or “don’t use AI,” but to meet students where they already are. That means clearer guidance, shared language, and space for dialogue about ethics, equity, wellbeing, and assessment.
Further analysis of the collected data will follow, and the findings will be shared more formally at the annual Student Experience Forum in June.