There's yet another "AI will kill us all! It poses a risk of extinction!" letter going around, and I just… Y'all, I am just so fucking tired.

CAPITALISM poses risk of extinction (climate change, right the fuck now).

WHITE SUPREMACY poses risk of extinction (genocide, eugenics).

HEGEMONY poses risk of extinction (nuclear FUCKING WAR).

And whatever "risk of extinction" "AI" poses, it poses because it is BUILT FROM THOSE EXTREMELY HUMAN VALUES.

Even if you stopped every "AI" project running, RIGHT THIS SECOND, those values would still kill us. And no matter how long you "pause" your "AI" projects, if you don't address those values? Then when you start your "AI" back up? You'll KEEP BUILDING THOSE SAME VALUES IN.

This is not hard. At this point, as much as it pains me to say it, it's not even novel. And yet you're still not fucking getting it.

I'm so goddamn tired.

@Wolven
Preach. Tools don't kill things: The wielders do.

Modern AI is a truly revolutionary technology with the potential to change the world, and it certainly will. The issue it poses is how we deal with that change; if we're not careful, thoughtful, and deliberate, we could do unspeakable damage.

@WelcomeToTheCafe how the tools are formed out of and embedded into the culture matters a great deal, too.

@Wolven
I can agree with that, certainly.

Thank you for your post!