Tools that utilise generative artificial intelligence have spread like wildfire. Joshua Wilhelm, in his column, expresses concern about the increasing, uncritical use of these tools among students. Drawing from personal experience, he proposes a solution: teaching comprehensive AI literacy. How should we do it?
Since the open access release of ChatGPT in November 2022, discussions about artificial intelligence (AI) and generative artificial intelligence (GenAI) have been omnipresent. Hardly a day goes by without new reports about GenAI, and everyone seems to have an opinion on the subject. And so do I, of course. Based on the finest of all data: personal experience.
Universities – and this is no secret – are under particular pressure in the current AI debate. On the one hand, they have to prevent students from cheating, but on the other hand, they also have to prepare them for conducting research or working life with the most up-to-date knowledge possible.
To my great surprise, many universities and lecturers still rely on categorical bans. In my seminars at the University of Hamburg and the University of Münster in Germany, I take a different approach.
SINCE THE BEGINNING of 2023, I have been teaching AI in adult education. Almost all students have tried ChatGPT anyway, and a significant proportion use it regularly for graded assignments – usually without permission. The reasons for this are plentiful and well-argued, and no one, absolutely no one, chooses this approach out of sheer laziness – at least according to the students. Whatever the reasons, almost none of them have familiarised themselves with how GenAI works, which ultimately leads them to adopt the generated texts without reflection.
Many students use GenAI for graded assignments – usually without permission.
In my seminars, I therefore try to add a critical and questioning dimension to the functional-pragmatic competence that most people have already acquired through experimentation, in order to enable a holistic AI literacy.
The combination of theoretical knowledge and practical work has proven to be the best practice. Both fields are then based on the students’ current part-time job or the job in which they would like to work after graduation, in order to create a connection that is as close as possible to their needs and interests. These fields include human resources development and coaching, for example.
The effect of this combination was striking for both bachelor's and master's students: after learning how ChatGPT works in the first session and successfully identifying biases under guidance in the second, a switch flipped.
On the one hand, they analysed the output of ChatGPT extremely critically from that point on; at the same time, this insight triggered a strong motivation to learn more about AI. The students asked more questions and brought AI programmes into the seminar for discussion. Their view extended beyond the university, toward bigger questions about social significance.
I try to add a critical and questioning dimension into the functional-pragmatic competence to enhance holistic AI literacy.
After the “demystification” of ChatGPT and GenAI, students worked on a reflective attitude and working method – in each case without specifying what this should look like. The groups came to the same conclusion independently of each other.
GENAI IS NOT (YET) suitable for content-related work, but it is very good at simplifying the writing process. For the future, I share Beth McMurtrie's view that tools such as ChatGPT will become part of writing in the same way that calculators are part of maths – not least because the programmes are low-threshold and inexpensive, but also because they are still in their infancy and will only improve.
Lecturers at all universities are therefore called upon to rethink whether term papers are still justifiable. Work that focusses on the discussion of content rather than the written elaboration would strengthen the quality of university teaching and learning anyway.
Sophisticated didactic concepts and a strong connection to the needs and interests of the students naturally increase interest and participation. However, my practical experience shows time and again that neither a great deal of content nor a great deal of time is needed to shift students' attitudes from uncritical use of GenAI to critical and questioning engagement.
Encouraged by this proof of concept, I see it as absolutely essential that universities do not blindly prohibit the use of GenAI, but rather that they become more committed to a critical approach and offer crash courses on AI literacy to all students.