More journalists are making the connection. See this article by Sharon Goldman.

"Other AI experts have also pointed out, both publicly and privately, that they are concerned by the companies’ publicly-acknowledged ties to the EA community — which is supported by tarnished tech figures like FTX’s Sam Bankman-Fried — as well as various TESCREAL movements such as longtermism and transhumanism. "

https://venturebeat.com/ai/doomer-advisor-joins-musks-xai-the-4th-top-research-lab-focused-on-ai-apocalypse/

Doomer AI advisor joins Musk’s xAI, the 4th top research lab focused on AI apocalypse

A quartet of the world's most famous AI labs — OpenAI, DeepMind, Anthropic and now xAI — are laser-focused on AI's existential risks.

VentureBeat
@timnitGebru
Serious question: why is transhumanism getting painted with the same brush as singularity folks?

@digifox Feel free to watch my talk and read the associated academic paper that will come out.

Doesn't help that transhumanism was developed by the chair of the British eugenics society, Julian Huxley?

Maybe the stated goals, and maybe what "modern" transhumanists like Bostrom have been writing openly for decades?
https://www.youtube.com/watch?v=P7XT4TWLzJw

SaTML 2023 - Timnit Gebru - Eugenics and the Promise of Utopia through AGI

Eugenics and the Promise of Utopia through Artificial General Intelligence. Based on work by Timnit Gebru & Émile P. Torres

YouTube

@digifox I mean not like they don't know: “since we as transhumans are seeking to attain the next level of human evolution, we run serious risks in having our ideas and programs branded by the popular media as neo-eugenics, racist, neo-nazi, etc.”

It's easy not to be a eugenicist, rather than worrying about the branding.

@timnitGebru @digifox The very fact they're still talking about levels of evolution when the whole point is self-directed modification into whatever we want shows they've just lost the plot entirely (or just never had it to start with, from the sound of it).

Any approach remotely similar to eugenics completely defeats the point of it and represents a loss of that freedom of self-determination, not a gain.

@lispi314 @timnitGebru @digifox Still have chills about an argument I had with an old friend in high school about how the Singularity was near (like... 15 years ago? lol) and that those who didn't assent to it were in some way "backwards" and would be put in camps.

Granted, it was less about the whole putting people in camps thing and more that he couldn't wrap his head around why anyone wouldn't want to transcend humanity when the Singularity (to him) was a mathematical certainty... in a vacuum. But the fact that he couldn't see the inhumanity in what he was trying to tell me- the division of us vs. them- was what really drove a wedge in our then-friendship.

Odd that this would end up happening with another friend, too.

@lawlznet
@lispi314
Any idea can go sour. Communism, Christianity, Islam, Atheism (ugh, New Atheism's right wing takeover still makes me so sad.)

Let's say for the sake of argument your friend is right about the Singularity being nigh (I think that's nuts, to be clear.) What should be done about it? Do you want to freeze technological development to prevent it?

@digifox @lispi314 Depends on who's in charge of it. I feel like if the Singularity ever actually happened, the first act of whoever discovered it happening would be to, ironically, lobotomize it and chain it up. The last thing any status quo wants is to create something it can't control, after all.

Part of said friend's myopia is that he didn't understand that. He thought that such things were irrelevant and that the common man would get access to all the fancy peak technological progress.

Then we look at real life and see that even "dumb bots" like AI Dungeon cost money, are heavily censored, and have limitations.

If the first act of whoever discovers a Singularity in their basement is to weaponize it, maybe it would be better to "halt progress" through any means necessary and try again. It's not like the whole "oh no, my decades of research!" destruction thing happens all that often in real life anymore- once it's determined that such a thing is possible, someone will figure it out again.

Maybe with more benevolent intentions.

@lawlznet @digifox The status quo benefiting scum are somewhat irrelevant though, it is perfectly feasible to create something away from their eyes and then spread it everywhere in such a way that the scum simply cannot put the cat back in the bag.

It's a bit similar to how "piracy" (it's called sharing) has helped avert copyright maximalists' attempts at killing culture.

@digifox @lispi314 Re: Atheism

imo atheism failed the day that it made the same mistake as every other organized belief system- its (loudest) adherents thinking themselves superior to others and taking actions to separate and elevate themselves, if not in the eyes of outsiders, then in the eyes of their peers, turning in on itself into this soup mix of Dunning-Kruger-infused echo boxing.

That's like the end stage of any group of people with a common passion, though; you see this same shitty pattern with TV show and video game fandoms.