More journalists making the connection. See this article by Sharon Goldman.

"Other AI experts have also pointed out, both publicly and privately, that they are concerned by the companies' publicly-acknowledged ties to the EA community — which is supported by tarnished tech figures like FTX's Sam Bankman-Fried — as well as various TESCREAL movements such as longtermism and transhumanism."

https://venturebeat.com/ai/doomer-advisor-joins-musks-xai-the-4th-top-research-lab-focused-on-ai-apocalypse/

Doomer AI advisor joins Musk’s xAI, the 4th top research lab focused on AI apocalypse

A quartet of the world's most famous AI labs — OpenAI, DeepMind, Anthropic and now xAI — are laser-focused on AI's existential risks.


"I am very aware of the fact that the EA movement is the one that is actually driving the whole thing around AGI & existential risk," @kchonyc told VentureBeat. "I think there are too many people in Silicon Valley with this kind of savior complex."

More researchers need to speak up. If you think it's not coming for you, think again.

You'll find your institution subsumed, whether in BigTech, academia or outside.

We'll get to a place where there's no funding for anything except their agenda, and where all policy is driven by this.

Even if you want to ignore it and do your thing, you'll soon be unable to do so.

@timnitGebru @kchonyc EA is a cult. Once I read about Nauru, it became clear to me that this is an apocalyptic cult.
@timnitGebru
Serious question: why is transhumanism getting painted with the same brush as singularity folks?

@digifox Feel free to watch my talk and read the associated academic paper that will come out.

Doesn't help that transhumanism was developed by the chair of the British Eugenics Society, Julian Huxley?

Maybe the stated goals, and maybe what "modern" transhumanists like Bostrom have been writing openly for decades?
https://www.youtube.com/watch?v=P7XT4TWLzJw

SaTML 2023 - Timnit Gebru - Eugenics and the Promise of Utopia through AGI

Eugenics and the Promise of Utopia through Artificial General Intelligence. Based on work by Timnit Gebru & Émile P. Torres


@digifox I mean not like they don't know: “since we as transhumans are seeking to attain the next level of human evolution, we run serious risks in having our ideas and programs branded by the popular media as neo-eugenics, racist, neo-nazi, etc.”

It's easy to simply not be a eugenicist, rather than worrying about the branding.

@timnitGebru You don't need to be a eugenicist to want to augment the human form with technology and medicine and raise the baseline of human existence.

Cosmetic surgery, gender transition, fertility services, birth control: all represent providing humans with options above and beyond just repairing lost functionality or mitigating a disability. The fact that we shoehorn these things into "medicine" as traditionally studied and practiced represents a limited understanding.

After watching your talk through the Second-Wave Eugenics section, I think the "TESCREAL bundle" as you derisively describe it is being overly condemned here by examining its more ... ridiculous proponents. I feel like judging TESCREAL by Musk, Yudkowsky, Altman, Bostrom, etc is equivalent to judging Christianity, Islam, or Atheism by the worst examples of their adherents.

@digifox None of those advances were done under the guise of transhumanism, nor needed to be.
@hackbod
No, but I don't see why transhumanism need be anything but the belief that destroying biological barriers is good and allows a broader range of human experience and expression. I fear backlash against it much like I fear degrowthers, pastoralism, and other reactionary tendencies that keep getting dressed up as progressive.
@hackbod
I respect and broadly agree with Dr Gebru's criticism of the current very specific cult surrounding "the Singularity" and the "promise of AGI" but ultimately can't help but feel it represents an overcorrection that technologists will come to regret supporting as we become mired in suspicion, over-regulation, and outright hostility.
@hackbod
Especially in light of the various attacks on individual and bodily autonomy happening in the US and across the globe.

@timnitGebru @digifox The very fact they're still talking about levels of evolution when the whole point is self-directed modification into whatever we want shows they've just lost the chart entirely (or just never had it to start with, from the sound of it).

Any approach remotely similar to eugenics completely defeats the point and represents a loss of that freedom of self-determination, not a gain.

@lispi314 @timnitGebru @digifox Still have chills about an argument I had with an old friend in high school about how the Singularity was near (like... 15 years ago? lol) and that those who didn't assent to it were in some way "backwards" and would be put in camps.

Granted, it was less about the whole putting people in camps thing and more that he couldn't wrap his head around why anyone wouldn't want to transcend humanity when the Singularity (to him) was a mathematical certainty... in a vacuum. But the fact that he couldn't see the inhumanity in what he was trying to tell me- the division of us vs them- was what really drove a wedge in our then-friendship.

Odd that this would end up happening with another friend, too.

@lawlznet
@lispi314
Any idea can go sour. Communism, Christianity, Islam, Atheism (ugh, New Atheism's right wing takeover still makes me so sad.)

Let's say for the sake of argument your friend is right about the Singularity being nigh (I think that's nuts, to be clear.) What should be done about it? Do you want to freeze technological development to prevent it?

@digifox @lispi314 Depends on who's in charge of it. I feel like if the Singularity ever actually happened, the first act of whoever discovered it happening would be to, ironically, lobotomize it and chain it up. The last thing any status quo wants is to create something it can't control, after all.

Part of said friend's myopia was that he didn't understand that. He thought that such things were irrelevant and that the common man would get access to all the fancy peak technological progress.

Then we look at real life and see that even "dumb bots" like AI Dungeon cost money, are heavily censored, and have limitations.

If the first act of whoever discovers a Singularity in their basement is to weaponize it, maybe it would be better to "halt progress" through any means necessary and try again. It's not like the whole "oh no, my decades of research!" destruction thing happens all that often in real life anymore- once it's determined that such a thing is possible, someone will figure it out again.

Maybe with more benevolent intentions.

@lawlznet @digifox The status quo benefiting scum are somewhat irrelevant though, it is perfectly feasible to create something away from their eyes and then spread it everywhere in such a way that the scum simply cannot put the cat back in the bag.

It's a bit similar to how "piracy" (it's called sharing) has helped avert copyright maximalists' attempts at killing culture.

@digifox @lispi314 Re: Atheism

imo atheism failed the day that it made the same mistake as every other organized belief system- its (loudest) adherents thinking themselves superior to others and taking actions to separate and elevate themselves, if not in the eyes of outsiders, then in the eyes of their peers, turning in on itself into this soup mix of Dunning-Kruger-infused echo boxing.

That's like the end stage of any group of people with a common passion, though; you see this same shitty pattern with TV show and video game fandoms.