I am increasingly worried about the current AI hype cycle taking down all of computer science with it. The more I think about it, the more convinced I am that the field is on the verge of a legitimacy crisis, driven by several root causes.

I've seen some AI/ML people (Timnit, etc) talk about the need for an anti-AGI movement. I think that applies to #CS generally. If the public at large comes to equate CS with AGI, it will kill the whole field for a decade when AGI implodes.

#CompSci

How I see this happening:

* AGI gets falsely presented as a real thing. The singularitarians amplify this.
* Snake-oil vendors pop up, offering supposedly AGI-based solutions for replacing whole swaths of jobs. Mid-2000s-style outsourcing comes back in vogue.
* These fail utterly or get revealed as Theranos-type scams backed by offshore labor.
* Public opinion turns sharply against CS generally, resulting in deep funding cuts, bad legislation, etc.

To be clear, I'm not advocating any kind of Luddism. Luddism and primitivism need to die of fucking Ebola.

CS and particularly the industry needs to get back to basics: solving problems and enabling people.

We need to get *a lot* better at calling out kooks, crackpots, and snake-oil. We absolutely have our own analogues of anti-vaxxers, homeopathy, and flat-earthers. We've done a shit job of policing that, and we've allowed it to grow unchecked.

@emc2 agreed, but I'll add a small comment: the Luddites were not anti-technology; they were against the exploitation that some employers engaged in by using technology in a particular way.

It wasn't about not using technology at all; it was about not using it to exploit people.

We *need* a modern-day Luddite movement — one that interrogates the hype around new technologies through the lens of how they affect labor and society in general.

@rysiek @emc2 +1, you had me until wishing a painful death on people you don’t agree with. That’s just messed up.

@philip @rysiek I *specifically* referred to -isms, not -ists, because they are abstract belief systems as opposed to people.

That was a deliberate semantic choice.