[read somewhere else] from Émile P. Torres, @xriskology :
"Friends, I just put together a website dedicated to "Silicon Valley pro-extinctionism." It includes a list of articles that I've recently published on the topic. In my view, this is one of the most important issues that almost no one is talking about."

https://www.xriskology.com/siliconvalleyproextinctionism

#siliconvalleyproextinctionism #siliconvalley #proextinctionism

RESOURCES | Émile P. Torres

Émile P. Torres

"I hope this long post (sorry about that!) provides a bit of conceptual clarity to the issue of SV pro-extinctionism. There are four main questions that SV pro-extinctionists disagree about: (1) whether posthumanity should be an extension of humanity or something entirely distinct, separate, and autonomous from us; (2) whether some current humans should join the ranks of these posthumans; and (3) whether our posthuman successors should embrace the same basic/core values as us. The fourth and fifth axes of disagreement concern what should happen once posthumanity arrives: should we coexist alongside them for the next million years? Should they immediately slaughter all of humanity? Or peacefully convince us to stop procreating? Or perhaps keep us caged in zoos until, I dunno, we decide to end things ourselves?

In conclusion, pro-extinctionism takes both traditional and Silicon Valley (SV) forms, yet even within the latter category, there are many variants. Hopefully, you now have a slightly better sense of the lay of the land."

https://www.realtimetechpocalypse.com/p/do-all-silicon-valley-pro-extinctionists

#ProExtinctionism #SiliconValley #AGI #PostHumanism

Do All Silicon Valley Pro-Extinctionists Want You Dead? (Part 3)

Silicon Valley pro-extinctionists want to replace humanity with a new posthuman species. Here's how to understand the different kinds of pro-extinctionism circulating in Silicon Valley. (4,000 words.)

Realtime Techpocalypse Newsletter

"Meet Daniel Faggella, a pro-extinctionist who leads the so-called “worthy successor” movement, which hopes to create an advanced AGI system to replace humanity. He claims that “the great (and ultimately, only) moral aim of artificial general intelligence should be the creation of Worthy Successor,” which he defines as “a posthuman intelligence so capable and morally valuable that you would gladly prefer that it (not humanity) control the government, and determine the future path of life itself.” We can call this version of pro-extinctionism “digital eugenics.”

Earlier this year, Faggella held a “Worthy Successor: AI and the Future After Humankind” conference in a San Francisco mansion (where else?), which he claims was attended by “team members from OpenAI, Anthropic, DeepMind, and other AGI labs, along with AGI safety organization founders, and multiple AI unicorn founders.” This points to the appetite that Silicon Valley dwellers have for pro-extinctionism. According to a recent announcement, Faggella is planning another such conference later this month in New York City, which aims to place “diplomats, AGI lab employees, AI policy thinkers and others into one room to discuss the trajectory of posthuman life.”"

https://www.realtimetechpocalypse.com/p/the-growing-specter-of-silicon-valley

#SiliconValley #ProExtinctionism #AGI #AIPolicy #AI #Posthumanism

The Growing Specter of Silicon Valley Pro-Extinctionism (Part 1)

The digital eugenicist Daniel Faggella argues that humanity should be replaced by a "worthy successor" in the form of AGI — his view is comically absurd and profoundly dangerous. (3,300 words)

Realtime Techpocalypse Newsletter

"I’ve written at length about the growing influence of pro-extinctionist sentiments within Silicon Valley. Pro-extinctionism is, roughly put, the view that our species, Homo sapiens, ought to go extinct.

Well, it looks like we just witnessed “the first example ever of someone openly being fired allegedly for wanting humanity to end.” At least that’s what some folks are saying, but I think this is misleading. The person in question wasn’t fired for being a pro-extinctionist. They were fired for holding a particular kind of pro-extinctionist view.

That distinction is crucial, as it points to a deeply problematic trend among Valley dwellers: more and more folks are embracing a “digital eschatology” (as I’ve called it before) according to which the future will, inevitably, be digital rather than biological. Debates among these people increasingly focus not on whether this digital future is desirable, but on which type of digital future is most desirable. Let’s dive in …"

https://www.realtimetechpocalypse.com/p/did-an-ai-company-just-fire-someone

#AI #AGI #ASI #ProExtinctionism #TransHumanism #DigitalEschatology

Did an AI Company Just Fire Someone for Endorsing Human Extinction?

Michael Druggan, former xAI employee, is now trying to de-extinct his career.

Realtime Techpocalypse Newsletter

"We must call this out for what it is: pro-extinctionism. The biological transhumanism of Thiel and the digital eugenics of Altman, Faggella, Yudkowsky, and the others—all of these views aim to supplant the human species with some kind of successors, which would then proceed (on the most popular view) to plunder Earth’s remaining resources and launch themselves into space to conquer the universe. This is overtly pro-extinctionist, and given that some of the most powerful people in one of the most powerful centers of society—Silicon Valley—accept it, we must conclude that pro-extinctionism is not a fringe ideology, but closer to the mainstream.

Even worse, these pro-extinctionists promote their ideology by claiming that, in fact, they oppose human extinction. This relies on two linguistic tricks: first, many define “humanity” in an idiosyncratic way that diverges from the definition that most of us intuitively accept. We tend to equate humanity with Homo sapiens, our species, whereas these people define “humanity” as including whatever posthuman descendants we might have. Hence, as I’ve highlighted elsewhere, our species could die out next year without “human extinction” having happened—so long as we’re replaced by posthumans, then “humanity” will live on. When they talk about avoiding “human extinction,” they aren’t talking about the extinction of our species. To the contrary, our extinction wouldn’t matter one bit once posthumanity arrives."

https://www.techpolicy.press/digital-eugenics-and-the-extinction-of-humanity/

#Transhumanism #DigitalEugenics #HumanExtinction #ProExtinctionism

Digital Eugenics and the Extinction of Humanity

If we are to combat the AI industry’s push to build digital gods, we have to understand the ideology of the antihumanist project, writes Dr. Émile P. Torres.

Tech Policy Press