I fucking hate ChatGPT and ai and all of that shit

https://lemmy.world/post/43978975

I’ve been working with so many students who turn to it as a first resort for everything. The second a problem stumps them, it’s AI. The first source for research is AI. It’s not even about the tech, there’s just something about not wanting to learn that deeply upsets me. It’s not really something I can understand. There is no reason to avoid getting better at writing.

I don’t know how to solve the core problem you’re hinting at without society at large realizing that many of our problems come from the brainwashing of the masses. That problem is why, in my day, we were initially taught math without calculators; by college, calculators were expected to handle the simple math so we could focus on the more complicated problems.

Here with LLMs it’s still important to write, to learn how to research something (this goes even beyond the old “don’t use encyclopedias as a primary source” rule), and to learn both to read with deep understanding and to skim. Learning math and logic is as important as ever.

What I see missing quite a bit in the anti-AI art world is the importance of creating art to convey your meaning. Whether or not AI is involved as a tool, for writing, images, etc., the question is: is this thing showing the meaning and nuance you want, or is it just an off-the-top-of-your-head prompt where you auto-ship the slop output? And the only way you can say “no, that’s not what I want” is to have some idea how to make the piece of writing or art yourself, even at a high level.

I personally like the tech, but I see it accelerating the brain drain for those who rely on it too much for answers as they learn.

Yeah, that’s the way I came up too. But I disagree with the “maths without calculators” approach - mainly because it feels like a brute-force solution that ignores the reality that calculators exist.

So does ChatGPT.

We should learn to use the tools we have, not pretend they aren’t there.

More importantly, using something like “do the maths the long way” as a proxy for teaching reasoning probably has limited transfer if it’s not framed explicitly. Like you, I learned a lot of logic through algebra - but no one ever connected those dots. I only realized years later that the real lesson was about reasoning, not just manipulating symbols.

What I’m getting at is:

  • the tools are already here
  • avoiding them isn’t realistic
  • teaching thinking indirectly through other skills is a pretty unreliable way to transmit it

If we actually care about developing thinkers, we probably need to teach reasoning, skepticism, and how to interrogate outputs directly, including outputs from tools like AI.