There's yet another "AI will kill us all! It poses a risk of extinction!" letter going around, and I just… Y'all i am just so fucking tired.

CAPITALISM poses risk of extinction (climate change, right the fuck now).

WHITE SUPREMACY poses risk of extinction (genocide, eugenics).

HEGEMONY poses risk of extinction (nuclear FUCKING WAR).

And whatever "risk of extinction" "AI" poses, it poses because it is BUILT FROM THOSE EXTREMELY HUMAN VALUES.

Even if you stopped every "AI" project running, RIGHT THIS SECOND, those values would still kill us. And no matter how long you "pause" your "AI" projects, if you don't address those values? Then when you start your "AI" back up? You'll KEEP BUILDING THOSE SAME VALUES IN.

This is not hard. At this point, as much as it pains me to say it, it's not even novel. And yet you're still not fucking getting it.

I'm so goddam tired.

Here. I've already said all this. Been saying it for damn near 20 years. Tired. Fucking wearied.
https://afutureworththinkingabout.com/?page_id=5038

Even if we take these dudes at their word that they really believe this, then regardless of mechanism, even "AI" would only think to kill us all because we—humans—modeled that to it as something to learn from and emulate.

And these dudes genuinely refuse to grapple with that fact.

@Wolven "something something instrumental convergence"
@Wolven So I inspired myself to finally read the Wikipedia article on this “instrumental convergence” I keep hearing about — and as I expected, but even more so, it is truly a tour de force of self-justification. These people projected so hard, they made their pathologies into a universal law of the cosmos. https://en.m.wikipedia.org/wiki/Instrumental_convergence

@Wolven No explicit defense of imperialism, genocide, or slavery in the article, but from what we know of Bostrom and his crew, you know those defenses have been made. It feels like a skeleton key that explains so much - even Hinton’s wild comment that "I don’t know any examples of more intelligent things being controlled by less intelligent things" @FeralRobots https://mastodon.social/@FeralRobots/110317139645593097

@misc @Wolven ahahaha

I might summarize "instrumental convergence" as "everybody's gotta be an asshole to get what they want"

which really makes the "telling on yourself" clearer

@trochee @Wolven It's just incredible how many assumptions are packed in at every step to make this ostensible law say (and justify) what they want.
@misc @Wolven there's even a section on "if you don't have to be an asshole to get it, your wants aren't big enough to count"