Kane Murdoch

Boiling frogs

Evening all,


I'm currently enjoying the tender mercies of the Sydney rail network as I type, so bear with me.


Also, apologies to subscribers: I had a Homeric fat-finger moment where, instead of saving a collection of silly thoughts involving a group of students smoking legal stuff from a bong on campus last week, I cleverly clicked publish, then immediately unpublished it. You're really seeing under the bonnet of the Murdoch F1 race car here, folks.

Moving on, I haven't touched on generative AI much, which may surprise some. The reason is mainly that I'm not an expert in it (not many people are, despite the proliferation of self-declared ones), and I've been considering the implications as they relate to my work. Unsurprisingly, this takes some time.


So when I'm asked to be part of a group to discuss my institution's response, I don't want to leap in with a bunch of idiotic suggestions. But I do have thoughts (some idiotic) and I wanted to share them here.


From the outset it has struck me that this is not a problem with a silver-bullet solution. Turnitin, for all of its failings and the criticism it incurs, kinda does what it says on the box. I believe the advent of Turnitin led to the radical shift toward, and explosive growth of, commercial contract cheating, but that's not what was intended. Despite the prevalence of plagiarism in assessment, the risk of detection by Turnitin significantly reduces how often it happens.


But the metastasising of "misconduct" turned on the stove. The gas is burning, and higher ed institutions and staff are the frogs. So when people such as Thomas Lancaster and the late Robert Clarke started to find contract cheating occurring, the water was getting warmer. That is now approaching twenty years ago (sorry Thomas).


The work that I, and others, have been doing in the past decade is kind of like a thermometer. And we've been saying for a while, sadly to mostly deaf ears, that the water is scaldingly hot.


And now we have GenAI: the lid is bouncing around on the saucepan and water is shooting all over the stove. However, there seems to be a growing awareness of the frogs' predicament. They're fucking boiling, and maybe wishing subject-level academics good luck as they too boil is not a great idea?


Which brings me back to the institutional response bit. I'm glad there's some awareness, but I would also like everyone to know how badly the frogs are crying. I also want people to know that there is no magic frog-cooling spray, delivered by our friends in "ed tech", that will cool them down. We will not be able to misconduct our way out of this problem. No amount of breach findings will dissuade students, or even staff, from using GenAI.


So one of my suggestions in this discussion was that, in the absence of broader program-level changes, the vivas I've previously proposed should be used to confirm marks. If a student can't defend their work, they don't get the marks. It's imperfect from a workload perspective, and could certainly be questioned in terms of inclusiveness. In other words, we need to focus on assessment policy rather than academic integrity policy. At this point the latter is a distraction that won't solve anything. Just this once, the "cop shit" people are right. A punitive response will not only fail to fix the problem, it won't even make a dent, and it will cost us vast energy to prosecute.


In due course I'll have some more thoughts to share, but this one has me pooped today, and I'm nearly at my station.


Until next time,

KM


2 Comments

Josh Andrews
Jun 20, 2023

It seems pretty clear to me — as a layperson, and also as a huge idiot — that generative AI has rendered undergraduate written assessment functionally impossible to run with any sort of integrity guarantee.


The solution would appear to be the return of handwritten exams, practical tests, and oral assessments — vivas perhaps, but also presentations and debates. Is the best way to hold back the generative AI tide not to simply rely on the things that a student can’t generate their way out of?


At least we pretty much know how to catch cheaters at their one-person desks in a handwritten, invigilated exam hall…

Kane Murdoch
Jun 20, 2023

In general I agree, but I think everyone would be surprised how often impersonators sit exams for students. Including me. Exams are not the secure assessment everyone thinks they are.
