can't remember where I saw it but "Using AI in education is like using a forklift in the gym. The weights do not actually need to be moved from place to place. That is not the work. The work is what happens within you" is a solid quote

Mine for AI in general is "chess does not need to be played."

@barquq

@barquq as a kid I understood the usefulness of pumping iron way more than I understood the function of homework.
Real Genius Classroom montage (1985)

Just the classroom clips from Real Genius (1985).

YouTube
@barquq
Excellent quote. I feel like this also applies to most situations involving using AI to produce code: the idea isn't just to produce code, it's to produce developers who understand the system they're building.
@3TomatoesShort @barquq So many bosses just want to lead code factories though. Why care if the product is mediocre, it makes money!
@ozzelot
I spent my whole (14 year) career in the one government agency, mostly on the same team, so I was completely spoilt in not having to deal with those kinds of problems much 😅
@barquq
Yes, absolutely. But on the other hand this is unfortunately not the (only/main) reason to get an education nowadays

https://bsky.app/profile/youngvulgarian.marieleconte.com/post/3lmtnpwcrk22n

Marie Le Conte (@youngvulgarian.marieleconte.com)

so I do absolutely, entirely agree with this but it also does feel like the logical endpoint of treating university as "a thing you have to do for three years otherwise you won't get a job" - if you're not making a case for why a uni education is useful, don't expect people to actually respect it [contains quote post or other embedded content]

Bluesky Social

@barquq
I get the ask.

If I may ask, didn't people say the same about calculators, programmable calculators, and the possibility of handing in essays written on a typewriter or computer instead of by hand?

@littledetritus @barquq Calculators are usually not encouraged while kids are learning arithmetic, nor keyboards when teaching handwriting; the use of an aid should not overshadow the skill being taught.

Even then, keyboards merely accelerate writing, they don't replace the writer's need to think, compose, and write. Calculators can replace the operator for simple tasks, but again: if you're using the tool to replace the learner or teacher, nothing gets learned.

@seachaint @littledetritus @barquq Also, before you learn to use a calculator, you are typically learning specific calculations so that you can both input them into a calculator, but also validate that the answer looks correct (i.e. "Wait, this looks to be off by a magnitude in estimation - did I give it bad values off by a factor of 10?", or "Wait - if there are only 360 degrees in a circle, why is my answer 450 degrees?".).

@Oldfartrant @seachaint @littledetritus @barquq I mean, the "proper" answer depends on the context; for example:

1.) It's actually a 90-degree move in the same direction (450 − 360 = 90), or equivalently a 270-degree move in the opposite direction;

2.) Your calculation used a bad value that ended up applying too much of a given quantity;

3.) You converted from radians incorrectly.
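As a small illustration of the 450-degree sanity check above (the numbers are just that example, nothing more), the first possibility is plain modular arithmetic:

```python
# Angles are periodic with period 360 degrees, so an implausible result
# like 450 degrees can be normalized before deciding what went wrong.
angle = 450.0
same_direction = angle % 360               # 90.0 degrees, same direction
opposite_direction = same_direction - 360  # -270.0 degrees, the other way round
print(same_direction, opposite_direction)  # 90.0 -270.0
```

Which explanation is right still takes the human: the arithmetic only tells you the two rotations are equivalent.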

@seachaint @littledetritus @barquq and calculators produce predictable and accurate results
@littledetritus @barquq Given the error rate AI has, the two aren't comparable in any way. A calculator is pretty damn reliable: it just does the numbers you punch into it. You still have to know what to tell it.

If you ask ChatGPT to calculate the RPM for turning a particular kind of steel, you have to pray it even picked the right formulas and charts.

A wrong number can mean thousands in damages.
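For contrast, the spindle-speed calculation alluded to above is a textbook one-liner: n = (Vc × 1000) / (π × D), with cutting speed Vc in m/min and workpiece diameter D in mm. A sketch, where the cutting-speed and diameter values are illustrative assumptions rather than numbers from any real chart:

```python
import math

def spindle_rpm(cutting_speed_m_min: float, diameter_mm: float) -> float:
    """Standard turning formula: n = (Vc * 1000) / (pi * D)."""
    return (cutting_speed_m_min * 1000) / (math.pi * diameter_mm)

# Assumed example values: carbide insert on mild steel,
# Vc ~ 200 m/min, workpiece diameter 50 mm.
print(round(spindle_rpm(200, 50)))  # 1273
```

A deterministic tool either computes this correctly or fails loudly; an LLM can confidently hand back a plausible-looking but wrong number, which is the whole complaint.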
@anarceus @littledetritus @barquq or even does the calculations in the first place instead of just giving you a number that sounds correct (since it's just predicting words, and not actually computing anything)
actually even trusting a word predictor to be able to do math at all is a horrible idea, since it *won't*
@soupermkc @littledetritus @barquq Really good point! At my apprenticeship school they showed us the usefulness of AI for calculating things (the specific AI they have shows the process or something) and it immediately used wrong data lol
@littledetritus @barquq Which is true. The real work is to do math in your head.
Working with people who don't have basic top-of-head math understanding is a nightmare... I regularly get to review docs that are off by orders of magnitude AFTER multiple reviews. By people who presumably relied on calculators their whole educational career
@littledetritus @barquq * calculators have you skip the part of doing mental arithmetic -- but you still have to figure out and execute math algorithms
* programmable calculators have you skip the part of executing math algorithms -- but you still have to figure out math algorithms and know how to program them
* typewriters have you skip handwriting -- but you still have to learn how to spell and put sentences together
* word processors have you skip spelling -- but you still have to put ideas together

@littledetritus @barquq so yes, with every tool there's a part you're no longer learning, and a part you still have to do, which is the tool input.

the question for AI then becomes: what is the part that you are no longer learning, and isn't that precisely what you're going to school to learn?

Especially because, unlike a calculator or a spellchecker, AI doesn't give the student a right answer. How can they ever learn to tell when the AI is wrong?

@littledetritus @barquq
Of calculators, not word processors. The problem with people who have only ever used calculators since middle school is that they're used to having either an answer or no answer, they don't have an instinctive feel for "this sounds plausible but doesn't feel right," because they don't know the landscape of the calculation.

@barquq Possible source:

https://www.newyorker.com/culture/the-weekend-essay/why-ai-isnt-going-to-make-art

Without paywall: https://archive.ph/HEYBn

“Using ChatGPT to complete assignments is like bringing a forklift into the weight room”

Why A.I. Isn’t Going to Make Art

Ted Chiang on how artificial intelligence still isn’t as intelligent as it is perceived to be and how its profound limitations should temper our fears about it replacing real art-making.

The New Yorker

@texttheater @barquq

And it's a broken forklift that can drop a weight on you at any time.

@texttheater @barquq Ted Chiang!! My man!!!

(Non-science-fiction-nerds may be familiar with a little film called "Arrival" based on one of his stories)

@barquq @lisamelton that’s good. Gonna steal that
@barquq In my experience, AI is a forklift that puts the weights in the wrong place. Honestly, even at the level of autocorrect, it is worse than useless.
@barquq I feel that since students will use it whether it is taught or not, it is better that its weaknesses as well as the few things it can do well are taught — rather than let the students believe they get the answer/the truth from an AI. They will run into it later regardless, and they should be aware what it CANNOT do.
@barquq Phrased in a more grim way, it's like helping a butterfly out of its cocoon. Does not leave it equipped to survive the environment it is making its way into.

@barquq @lisamelton

I agree. AND, it’s also like using an e-bike for fitness: opens up the activity to a lot of people, and makes it a lot more enjoyable, and you actually do get quite fit and good at riding a bike!

@barquq some geezer on Bluesky is getting numbers from it, but it's actually adapted from Ted Chiang here https://archive.is/XGKUd
@barquq I think it was a New Yorker article
@barquq great, if only all people working in education believed that too… For all the mindless bullshit work I had to do back in school, I would absolutely have used any cheat I could, had it been invented already. The goal often wasn't really to learn something, unfortunately, and from what I hear this hasn't changed much.

@B3r6ur @barquq Yep, that's a different side of the issue... As long as the education is goal-oriented, using cheats and shortcuts makes too much sense. And if the "cheat engine" is a gross slop machine - you're fighting fire with fire.

Really, we need both to reform the education system and to drive shit like LLMs into the mud.

@barquq ...until your scholarship renewal depends on your grades and your part-time job and childcare have left you ragged. I can't find it in myself to blame people caught in a deeply unfair system for using these tools any more than I can blame them for driving cars when there's no public transit.
@barquq
That’s a good analogy, but I’d say it’s not even as good as using a forklift in the gym, because you can at least trust the forklift to move weights, in a way you can’t trust “AI” to do what’s being asked of it.

@barquq

I really like that.

I found the analogy in this video - https://www.youtube.com/watch?v=08NuUbcgT9Q - and on this semi-pseudonymous blog - https://theeffortfuleducator.com/2025/03/18/a-classroom-teachers-take-on-ai/

Neither is as quotable as your version, though.

That's a good one.

How Students Writing with AI is Like Using a Forklift at the Gym

YouTube

@barquq wow.

Using AI to write enterprise software is similar. The purpose of enterprise application programs is to capture knowledge. Using AI specifically defeats that.

(Also true for outsourcing IT support — you donate information about the enterprise to contractors to exploit.)

@slott56 @barquq
I mean, no one is paying me to learn about the system; they're paying me to have a system do something so people don't have to. The fact that I need to learn about the system to do that is incidental.
A sensible company will keep knowledge in house, because if you lose too much of it you will no longer be able to adapt the system, and then you won't be able to use it to do things so people don't have to. But again, that's incidental to the main aim.

@econads @barquq I disagree slightly on the priorities. The goal of automating something depends on knowledge capture which depends on understanding.

Since the automation must be auditable and observable, the understanding is primary. Automation doesn’t make the work go away. It shifts the burden from doing to confirming.

@slott56 @barquq
Hmm, maybe English is failing us here and our definitions are a bit different :D
As anyone who has worked on legacy code knows, understanding is not needed every minute a system runs. It had to be understood once to write the program, and it has to be rediscovered (even if incompletely) to adapt or fix it, but there can be stretches where no one in the company knows exactly what it does. In fact, I've often encountered "we can't change that, we might break it".
@econads @barquq you’ve pinpointed the problem I see all the time: “we can’t touch it because we don’t understand it.” That’s an organizational failure. Yes, it’s common. But it’s bad. And AI only creates more of it.
@slott56 @barquq
Oh yeah, don't get me wrong, I'm not arguing in favour of using AI to write code (although I'm guilty of it occasionally). I'm nitpicking about the point of coding :-)
@econads @barquq Got it. Companies fail to understand what they’re doing on a regular basis. Your experience is common. And it reflects a failure to understand what code is. Not all companies treat code as an opaque necessity. Those that do are doomed to increasing IT costs and eventual buy-out by a company not as blind.

@slott56 @barquq
The thing is you're not paid to only learn about the code, you're paid to provide changes to it. No company is going to pay you to read or play with code without making any commits or releases or transferring that knowledge. The learning comes because that's a more efficient way of making changes sustainably.

Anyway, I guess we reached the end of the useful conversation and we're going to start going round in circles, so never mind :-) it was an interesting point you made.

@econads @barquq

No argument. Understanding is valuable when it permits change and adaptation. Something AI-generated can’t do.

@slott56 @barquq
No argument about that either

@barquq It's not me (as I'm just a "nobody personal blogger" 😅), but I've used the exact same image! 😊 https://havn.blog/2025/03/01/on-the-need-for-friction.html

"If you’re at the gym, there are many examples of how technology can enhance the effectiveness of our artificial physical exercise. However, using a forklift to lift weights might be more effective and comfortable, compared to doing it yourself – but it also makes the action completely useless!
(1/2)

On the Need for Friction

Imagine talking to a medieval farmer, about the concept of …

@barquq The point isn’t _that_ the weights get lifted, but that _you_ do it. This is in contrast with a warehouse, where the point is to get the stuff lifted."

I've used this image in conversations since 2022 (I'm a teacher). Cool to see this is something others have thought about as well!
(2/2)

@barquq Something like that could be said about most work students do in primary and secondary school, from arithmetic exercises to lab experiments to literary analysis. It could cover a lot of post-secondary work too, though at that point your results _might_ be more generally useful.

Using a calculator, fudging measurements, copying homework from others, getting "AI" to do it, or whatever mostly only defeats your education.

There is one caveat. Some people do education for the certifications -- a degree from an accredited university, or something else. If that's your goal, academic dishonesty, including but not limited to the use of "AI" tools, might be the most efficient way to achieve your goal.

@barquq Unless the goal is to train forklift drivers. Then you need forklifts. But we don't train forklift drivers at the gym. Should we train AI users at public schools? I argue that it's essential to train students to know how to use AI tools and to recognize their use. It's important to know when AI is useful and when it's wasteful and/or deceitful. This facility is called critical thinking. We all need to be trained in it.
Sam Halpert (@samhalpert.bsky.social)

Even accepting the premise that AI produces useful writing (which no one should), using AI in education is like using a forklift at the gym. The weights do not actually need to be moved from place to place. That is not the work. The work is what happens within you.

Bluesky Social
@barquq Have you seen Accelerate Your Learning with ChatGPT by Dr. Barbara Oakley (Learning How to Learn) and Dr. Jules White? https://www.coursera.org/learn/learning-chatgpt They propose using the AI to explain using metaphors that you are more likely to be familiar with and also to support retrieval practice.
Accelerate Your Learning with ChatGPT

Offered by Deep Teaching Solutions. Transform your approach to learning with AI! "Accelerate Your Learning with ChatGPT" brings together two ... Enroll for free.

Coursera
@barquq Good quote, but in the analogy you can still learn to drive a forklift. AI is a new tool that you should be aware of. But I completely agree that it should not replace learning itself.
@barquq Guess we better not let students studying accounting use Excel either, right? 🤪
@[email protected] I believe AI in education will be put to great effect eventually though, in some ways you could think of it as a gym building machine (?)