Get Your S*** Together: How Students Should Use AI in University
This one’s for all the students who have ChatGPT on speed dial. NUS lecturer Dr. Shobha Avadhani answers our questions on how students should actually be using AI in the classroom, without short-changing their learning process.
By Kenme Lam EJ
There’s no denying it: AI, especially large language models (LLMs) like ChatGPT, has fundamentally reshaped the way students think, learn, and complete assignments. In a university setting, you’d be hard-pressed to find someone who hasn’t used it to brainstorm ideas, clean up their grammar, or, in some cases, generate entire pieces of work.
But as AI becomes increasingly embedded in academic life, the question isn’t whether students are using it, but how.
We speak with Dr. Shobha Avadhani, Senior Lecturer at the Department of Communication and New Media, National University of Singapore (NUS), where she teaches public speaking and media studies. She shares her perspective on what meaningful, responsible AI use in the classroom actually looks like – and why the conversation needs to go beyond policing its use to rethinking how learning is designed.
Dr. Avadhani is a Senior Lecturer at NUS’ Department of Communication and New Media.
What are some common mistakes students make when using AI in their academic work without realising it?
Dr. Avadhani: “From my observation, when students use AI in their writing, the reason they state for using it also points to the possibility that they can’t credibly evaluate the given output.
For example, if a student says they used AI to clean up the grammar, it usually means that they don’t have a sense of what is grammatical. They have to trust whatever their AI tool tells them. In general, though, I don’t see many mistakes.”
If a student denies using AI in their work (say, in an essay or piece of code), but their writing style seems highly similar to that of large language models like ChatGPT, how can educational institutions prove that AI was used?
Dr. Avadhani: “I don’t think this can really be proven. There are some tools, but they are not foolproof. The first line of action, after proactively designing assessment for specific objectives and teaching students about academic integrity, is to have a conversation with the student in question. Usually, students who have actively worked through the process and not overly relied on AI are able to provide a credible account of their reasoning.”
From your experience, how can educators tell when a student is over-relying on AI at the expense of their own understanding?
Dr. Avadhani: “There are some ‘tells’, and these usually come through in the interstices of ideas and authorial voice. However, it is not always clear.
More importantly, we should move past the idea that the only problem is the ‘purity’ of the final product. What really matters is maintaining a space for students to learn what you want your course to teach them. If your task for them is so easily delegated to AI, then it is time to change the task.”
Dr. Avadhani at a conference in 2022.
In today’s classrooms, where do you see AI being appropriately used, and in what situations is it actually encouraged for students?
Dr. Avadhani: “I think it’s a bit early to answer this question, since we don’t really have solid research on this yet. It is not always the case that what is encouraged is what is appropriate, and many educators are still trying to figure out what works.
Even the notion of what works is not a simple one, because we have to take into account the pressures to use AI. Hence, it isn’t just about student learning per se. As with much of EdTech (short for education technology: digital tools designed to support learning in the classroom), the starting point is not always about designing what is best for learning, but about finding an educational use case for a technology that was developed in another context.”
How aligned are university expectations around AI with what employers and industries expect? Many students have shared that in professional settings, they’re often encouraged – or even instructed – to use AI for certain tasks, from generating mock-ups to drafting written content.
Dr. Avadhani: “I think there is a major difference between university and industry. In the former case, the objective is to learn how to think. We care more about students’ process than their product.
In the latter case, the objective is to generate profit for the company. The concern is about whether the product meets the requirements. So when it comes to education, I don’t think there needs to be complete alignment in this regard. Tools keep changing and workers learn how to use them when they need them. Students who learn how to think critically can participate fully as democratic citizens.”
As AI becomes more embedded in learning, what skills will become even more important for students to develop during their time in university?
Dr. Avadhani: “I don’t know how embedded it will become, nor what the mechanisms of that embedding would be, but skills of critical thinking, reflection, connection, and awareness of nuances in language are, I think, some of the most significant areas of learning for students.”
Looking ahead, how do you see AI shaping the future of tertiary education – especially for Gen Z and the students after them?
Dr. Avadhani: “I am not sure. There are definitely a lot of possibilities, but for AI to be sustainable it would require more research into its economic and environmental impacts, among others. I don’t want to downplay how significantly it has disrupted education in some ways, but there are other ways in which it has led to a recommitment to and reclamation of core values such as trust and relationship-building in the classroom.
I also think we shouldn’t lose focus on some enduring problems which can become far more severe if we are not careful, such as inequality of access. Assuming that all students have access to the same tools can mean that we inadvertently weight all assessments in favour of those who can afford paid subscriptions to better tools.
We should also be sensitive to the actual experiences that students have with AI, such as sexual harassment or psychological harm. Staying connected to students, prioritising conversations around ethics, and bringing students into the process of instructional design are some ways in which we can negotiate the tensions around AI.”
Get Your S*** Together (or GYST for short) is a new column that tackles the realities of adulting and figuring life out along the way.