LLM use is the most demoralizing problem I’ve faced as a college instructor.
https://arstechnica.com/science/2026/04/to-teach-in-the-time-of-chatgpt-is-to-know-pain/?utm_brand=arstechnica&utm_social-type=owned&utm_source=mastodon&utm_medium=social
Strangely enough, it seems like going backwards (really far back) is the way forward. Testing, even through the medium of writing, is much more "gameable" than an oral exam. It seems that Socrates guy was onto something.
@arstechnica It's so demoralizing to see my fellow students submit AI-generated text, with almost no edits, on assignments and discussions.
Last week's discussion board about the war in Ukraine had 6 student posts. 5 of them looked strikingly similar: they covered the exact same topic, used similar vocabulary, and their first 3 sentences were almost identical. The posts were almost certainly AI-generated.
@arstechnica
Quote:
One was complaining about an assignment they needed to do that night, and another incredulously asked why they wouldn’t just have ChatGPT do it. The first replied, “This is my major, I actually need to learn stuff in this class. I use AI for my other classes."
--
Schools are evaluated and accredited based in part on how credits from different disciplines count towards a degree. A Bachelor's degree requires not just depth in the major but breadth across disciplines.
This undermines the entire accreditation process, since students are now choosing the classes in which they will actually learn and the classes in which they will merely perform pretend learning.