RE: https://tldr.nettime.org/@tante/116206664781934665

Like: I am not even a scientist or academic and even I see how garbage that idea is.

@tante As a scientist, I can say that yes, these people are unequivocally missing the point.

"Research" (reading literature, evaluating contrasting papers, etc) is different from "research" (theorizing or designing and doing experiments), and this latter definition cannot be done by these AI tools.

You need to interact with the real world in some way and those interactions are where you, the scientist, actually learn valuable things that are worth writing about and sharing.

@tante AI Booster: "but you can scan the literature and formulate hypotheses faster using LLMs so you can focus on the theorizing more"

The reading, criticism, and evaluation are where you figure out what questions and hypotheses are even worth investigating in the first place, and how to go about doing the experimentation.

It's like practicing your scales to become a good composer - you need to know the fundamental building blocks and what's possible before putting together something of value.

@jrhawley @tante This is what I try to teach my students; so far I've hopelessly failed at it. I know they're not really interested in doing science themselves. Nonetheless, critical thinking cannot work without doing the thinking part on one's own. My students don't agree. Or maybe they're just not into critical thinking as much as I am.

@Maristya
Of course they're not. It's hard work. (My tongue is only partially inserted in my cheek)

@jrhawley @tante

@Maristya
One metaphor drives it home for my students.

"Using AI to study is like taking a forklift to the gym. The weights will move all right, but it would be pointless and you'd gain nothing."

I reckon for research it largely applies as well.
@jrhawley @tante

@Maristya

Much as I wish I'd come up with that, I got the idea from Derek Muller (the Veritasium guy), who gave a talk on AI and learning a while ago. If you get a chance to watch it, he makes quite a few good points.

https://www.youtube.com/watch?v=0xS68sl2D70

@jrhawley @tante

Veritasium: What Everyone Gets Wrong About AI and Learning – Derek Muller Explains


@jrhawley @tante Yeah, that AI booster argument is flawed because it's a statistics machine, and they don't even see that flaw because of the hype. It's more like:

"You can create songs similar to all the previous songs we've recorded."

That's a killer for theory work where NEW ideas should be worked out.

@ppxl @tante Exactly. Interesting new ideas often come from the synthesis and intersection of disparate old ideas, or examining old ideas from a new perspective that matters to someone. That's not what these AI tools are built to do and those perspectives and novel intersections come from people who care about those ideas.

@jrhawley @tante

They're also missing that putting in the hours and actually consciously engaging with the material is how you build the understanding and intuition required to pose new questions. You can't short-circuit that process.

(This is the same thing they're oblivious to in software development too: when you truly understand the problem, writing the code is trivial. When you don't, writing the code is how you gain understanding of the problem.)