As the internet – well, the parts of the internet inhabited by developers – melts down over the release of GPT-4, the successor to GPT-3/3.5 that can also take images as input, I’ve been thinking about the responsibilities we educators have to our learners. It’s a foregone conclusion that many workplaces and work practices will increasingly be augmented by AI tools. Curiously, what we are getting is not necessarily the promised land of automation of all those tedious jobs where a human has to look across multiple platforms, files or databases in order to assemble, update or generate something meaningful. (How many times a day do I copy and paste via Notepad to strip formatting?? Why can’t Windows remember the last folder I was working in??) So much for AI freeing up time; if anything, the cognitive and administrative load on educators is increasing because of AI.

So where should that limited energy and effort go?
My opinion is that it should go towards helping students develop the skills to understand, critique, and generally deal with AI platforms, or at least be prepared to engage with the (hopefully) regulated and ethically sound platforms of the future. I’m not for a moment saying that we should get all students onto ChatGPT now, but we need to start thoughtful discussions with them about AI. They themselves are best placed to ask the really difficult questions, like: what is the carbon footprint of a ChatGPT query? What cis, white, Western, male biases are AI tools replicating? Can new knowledge be made, or will we be eternally returning to an increasingly bland lowest common denominator of what we already know?
Prompt engineering is absolutely a skill that students should be developing. As is editing and refinement of the output of AI. Even more so is the skill of knowing when it is appropriate to use AI and when not to.
In addition, there is a skill we will all have to relearn: reading. Reading is no longer what it used to be now that we know that what we are reading may have been generated by a non-thinking, predictive model. Skim reading, fast reading, knowing when to skip whole paragraphs or jump to the relevant bits will all be massively important when we are drowning in a sea of content. And that’s just for text; images, video and audio are all going to have to be viewed, watched and listened to with circumspection. This places another layer of barriers in front of disabled students, or students who are being taught in a language different to their first; how does one skim read with a screen reader, or when you need to live translate in your head as you go?
The internet fire hose of stuff is about to get an upgrade, and we all need to (wet)suit up. And this time it will draw on everything that is known about how easily we are manipulated into outrage and into spreading hate and lies. Digital media literacy, data literacy, science literacy: let’s throw them all in, because without these as graduate attributes, any idea of the university as a ‘good’ for society is left for dead.
Behind it all, there are some people and companies who will be making a lot of money. It’s never been more important to interrogate the ‘black box’ of a technology, especially as the debate rages on about whether developers themselves can still see into the innards of their algorithms. Surely now is the time to start equipping ourselves and our students with critical digital and AI literacies?