We have spotted quite a few students using generative AI in their essays this summer and applied standard academic misconduct proceedings, though in most cases the work was so bad they would've failed anyway.

Today I learned of one whose use was sufficiently extensive that they will fail their degree.

I am wondering if this is *the first time a student has failed a whole degree for using AI*? Would love to hear about other cases. If you want to tell me in confidence, my Session ID is in my Bio

@tomstoneham
I had great luck turning this problem into a strength: I asked students to generate a bit of their paper using an AI, then critique its output. Details:
https://hachyderm.io/@inthehands/109479808455388578

Among other things, implicitly saying “the AI’s output is pretty much guaranteed to suck” sent a useful message about using it to cheat.

Paul Cantrell (@[email protected])

Attached: 3 images

OK, trying an experiment with my Programming Languages class!
• Have an AI generate some of your writing assignment.
• Critique its output. Call BS on its BS.
Assignment details in screenshots below. I’ll let you know how it goes.
(Here are the links from the screenshots:)
Raw AI Text: https://gist.github.com/pcantrell/7b68ce7c5b2e329543e2dadd6853be21
Comments on AI Text: https://gist.github.com/pcantrell/d51bc2d4257027a6b4c64c9010d42c32
(Better) Human Text: https://gist.github.com/pcantrell/f363734336e6063f61e451e2658b50a6
#ai #chatgpt #education #writing #highered #swift #proglang


@inthehands
We thought about doing that but didn't get time.

Personally, I am happy for AI to be used as a tool (we don't ban spellcheckers and autocorrect, or even proofreaders) so long as the student is taking the final decision about what goes in and what does not. Editorial responsibility, as it were.

That is how law firms like Allen & Overy use it.

Of course, in many disciplines it is currently a pretty rubbish tool!