RE: https://social.coop/@luis_in_brief/116388921310943162

People aren't building "AI" *tools* for Wikipedians, they're building "AI" weapons to deliberately injure the maintenance of human knowledge.

That nuance is extremely important, because something that's genuinely a tool must be genuinely useful in some way. Knives, lighters, and cars get to be tools because, for all the immense harm that can be done with them if they're misused, there are actual, genuine use cases for them.

That absolutely doesn't apply to "Gen AI," and I will die on that hill.

@kimcrawley Agree with this. GenAI is incapable of critical evaluation; it can only compute statistical likelihood. It can't tell whether a translation preserves meaning, because it has no concept of meaning. It can't tell whether a statement is true, because it has no concept of truth.

GenAI can do one thing: produce output. Wikipedia faces many challenges, and "not enough text" has never been one of them. The vision of generating text, and therefore inaccuracies, at scale for editors to fix is utterly demented.