Artificial intelligence inserting doubt into the relationship between educators and learners

As the various responses to artificial intelligence tools such as ChatGPT wash over us in education, I’ve been thinking about the consequences for the relational aspects of education.

Just as deepfake video, AI-generated images and even naturalistic voice platforms make us second-guess the veracity and provenance of what we are seeing or hearing, human-like text generation has inserted a doubt into our minds. The first doubt is the educator’s, about their own skills: can they discern what is student-generated work and what is not? The second is the more obvious question of whether the work they are spending time grading and giving feedback upon is the words, thoughts and accurate reflection of a human’s learning. In combination, these doubts become present whenever a lecturer sets about the task of grading and/or giving feedback on student work: has artificial intelligence been used or not? So the potential impact on students, who are putting in time and their original work, is that their work is, by default, potentially treated with distrust from the start.


This leads to the other area of doubt, which is on the learner’s part. They may doubt that their work or effort is being taken at face value as their own. Taken to the next logical level, they may also doubt that any personalised feedback and grades they seemingly receive from a human educator were not themselves generated by AI. This ‘weaponisation’ of AI can come from both sides looking for efficiency, or simply as a crutch to prop up a lingering doubt that their own work is really any better than AI’s (yes, academics have imposter syndrome as much as students).

While I don’t fully subscribe to the thesis of Adrian Wallbank’s piece in Times Higher Education that AI should be resisted and kept completely away from the classroom (good luck policing that), I agree that assessment should be used as a process for students to reflect on their learning:

“What I suggest ought to be assessed (and which helps us navigate some of the issues posed by ChatGPT) is a record of the student’s personal, but academically justified, reflections, arguments, philosophising and negotiations. Student work would then be a genuine, warts-and-all record of the process of learning, rather than the submission of a “performative” product or “right argument”, as one of the students in my research so aptly put it. This would enable our students to become better thinkers.”

Ben Thompson, the excellent technology journalist (another sector and profession having an existential moment of crisis about AI), also contributes a parent’s view of the education situation and suggests that the new skills learners could develop are editing and verifying information. It’s not a bad point, and perhaps an obvious end-point for the information abundance students live within now and will in the future. Seeking out the human skills needed to work with AI-generated content, and assessing those skills, is a good way to go.

As I gather advice and resources for colleagues to help us mull over the short-term and long-term strategies we need to employ, I don’t think I can resist any longer the thought that this is a game-changing moment for education. In a YouTube video, Charles Knight puts it well: the economic model upon which higher education has been operating – that is, the time-pressured systems of assessment for staff and students, relying often on precarious labour – has left itself vulnerable to gamification. I’d argue that gamification of that system is now in the hands of everyone, staff and students. Knight rightly argues that now is the time for appropriate resourcing of staff workloads, to give them the time both to design assessments and to grade them. I can only add that investing in people – those who teach and support learners – is more important than paying money for technologies to catch people out. As many before me have observed, the latter is an arms race that cannot be won. Teaching and learning is relational, and only by prioritising that relationship with time, money and status will higher education be equipped to deal with the doubt and distrust inserting itself between educators and students.
