You may have heard the acronym TESCREAL. It bundles seven toxic ideologies.

- Transhumanism
- Extropianism
- Singularitarianism
- Cosmism
- Rationalism
- Effective Altruism
- Longtermism

Here are my personal notes¹ on what each ideology means:

Transhumanism (T) is the belief in enhancing the human mind and body with artificial intelligence, genetic engineering, and other technologies, eventually evolving into god-like beings (posthumans) as the next step of evolution. Think "Homo Deus". Transhumans—still recognizably human but enhanced—are the transitional stage on the path to such posthumans.

Extropianism (E) is the belief in the eventual realization of indefinite lifespans through nanotechnology and similar technologies. It's the belief that through mind uploading and future advances in biomedical technology, even immortality will be achieved. It's the belief that in the future, those whose brains and bodies have been preserved through cryonics will be able to be revived.

Singularitarianism (S) is the belief that superintelligence will likely be created in the medium-term future—the event Ray Kurzweil calls the technological singularity—and that this is desirable if guided prudently. Singularitarianism can be understood as the belief in the realization of what AI enthusiasts call AGI.

Cosmism (C) subsumes the three ideologies transhumanism, extropianism, and singularitarianism, and adds the belief that humans who leave biology behind by merging with technology and uploading their minds will spread to the stars, roam the universe, and create virtual worlds—synthetic realities—in which posthumans could live. It focuses less on becoming posthuman and instead concerns itself with how such posthumans would transform the universe.

¹ I copied liberally from Wikipedia pages and from Timnit Gebru and Émile P. Torres's paper, rearranging the snippets into these personal notes that I hereby make public. This is not an academic paper, and I'm not claiming I came up with any of this. All credit to the original authors.
"The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence", First Monday: https://firstmonday.org/ojs/index.php/fm/article/view/13636

1/3

#TESCREAL #capitalism #fascism


Rationalism (R) refers to the belief, held by the community around blogs such as LessWrong, that rationality should be trained and reasoning improved in order to make better decisions. This community believes that all aspects of life should be decided solely on the basis of rational thinking, and it is suspicious of how emotions might inhibit it. Forget feelings of love or attraction when it comes to partner selection; account for your "cognitive biases" and decide based on what is rational, above all by ignoring your feelings. Believers in this ideology aren't necessarily followers of the other ideologies in the TESCREAL bundle, but they are often sympathetic to the transhumanist worldview, and one of the most popular topics on the LessWrong forum is the creation of superintelligence (see singularitarianism).

Effective Altruism (EA) can be understood as the application of rationalism to ethics. Heavily influenced by utilitarianism, believers in this ideology would let 100,000 people die today without thinking twice if doing so would somehow improve the lives of 1 billion people by more than the value of the lost lives. They see their sole moral responsibility as maximizing the total quantity of "value" in the universe. Although this ideology may initially have concerned itself with doing the "most good" possible with finite resources, its purpose is often to justify immoral cuts to humanitarian aid. It's the belief that if humans have, on average, net-positive lives, then more humans in the universe means more value in the universe. From this follows the belief in a moral duty to maximize the human population, under the constraint that everyone lives, on average, a net-positive life. This belief overlaps with cosmism and its desire to create virtual worlds that could house an unfathomable number of posthumans, whose net-positive lives would add immense value to the universe.

2/3

#TESCREAL #capitalism #fascism

Longtermism (L) is a continuation of effective altruism: it assumes that humanity will succeed in colonizing the entire universe or in creating virtual worlds, which would then contain unimaginable numbers of posthumans. Under these assumptions, far more total "value" would be created in the universe if today's limited resources were spent for the benefit of these future posthumans, who will only be born billions or even trillions of years from now, than for the benefit of the humans living today. Believers claim to aim to positively affect the greatest number of people possible, and in this belief system, most people who could ever exist will exist in the far, far future. They argue that humanity should stop focusing on currently living people and their contemporary problems—unless those problems influence the distant future—because that focus is a suboptimal use of resources on the mission to maximize total value. Humanity should instead focus on what would benefit the population of the far, far future, which adds more total value to the universe because of the unimaginably larger number of people in that future population. This, suspiciously, always seems to coincide with what benefits billionaires today.

Representatives of each toxic ideology have been platformed on the #LexFridman podcast: from Ray Kurzweil to Nick Bostrom to Ben Goertzel to Eliezer Yudkowsky to William MacAskill to Elon Musk.

I've watched colleagues fall victim to these toxic ideologies because they didn't understand them. They naively and cluelessly saw Lex Fridman as a role model—an intelligent, disciplined big brother who is good at guitar and martial arts. They thought the platformed guests were just random people with interesting ideas, rather than part of a concerted effort to spread these toxic ideologies among young (and not so young) men.

And I haven't even mentioned effective accelerationism or figures like Guillaume Verdon or Marc Andreessen yet. They want to maximize the likelihood of realizing AGI (see singularitarianism) by turning the knobs of capitalism up to eleven. And everyone should know what that means: fascism.

Sources: The TESCREAL paper and the relevant Wikipedia pages

3/3

#TESCREAL #capitalism #fascism