I had forgotten that Ivy Astrix says that someone called chaosprime, who was active in Vibecamp and TPOT (a rationalist-adjacent Twitter community with a summer camp), turned out to be a convicted child molester. Funny how that keeps happening!
Reflecting On a Few Very, Very Strange Years In Silicon Valley's Rape Culture

On rape culture in Silicon Valley and its subcultures: Rationalism, Effective Altruism, TPOT, and Vibecamp. Also the informal rules that dictate how women are allowed to talk about sexual violence.

The Asterisk
An Effective Altruism Forum post accuses Michael Vassar of very rude behaviour towards younger female partners, and of telling a 22-year-old woman that it was noble and educational for older men to have sex with girls as young as 12. Those are two very common archetypes (the Libertarian Pedophile and the Christian Grey who can only dominate people with very little experience) but not Aella’s sexuality.
Abuse in LessWrong and rationalist communities in Bloomberg News — EA Forum

This is a linkpost for https://www.bloomberg.com/news/features/2023-03-07/effective-altruism-s-problems-go-beyond-sam-bankman-fried

Aella's Influence on Rationalist Kink Practices (cursed phrase)

https://awful.systems/post/7630417

Aella's Influence on Rationalist Kink Practices (cursed phrase) - awful.systems

The essay by Noelle Perdue [https://noelleperdue.substack.com/p/dying-losers] has some blind spots, but Perdue was struck by one of the rationalists’ kink practices:

> The wider network of Effectively Altruistic, Bay Area AI tech brotherhood has been covered on and off- in varying degrees of concern- for their seemingly wide community interest in kink, BDSM and “Consensual Non-Consent,” aka rape play. I experienced this myself, sitting in a circle of self-identified rationalists as they explained to me the pleasures of “red means no” parties; full-contact “rape orgies” where participants are encouraged to fight back.

Scott Alexander and Scott Aaronson mostly want a woman to produce and raise babies. Gwern does not seem to post much about sexuality. Kelsey Piper probably keeps that to Tumblr and Project Lawful, although she is queer and polyamorous. Yudkowsky is into dominance, sadism, and horny Japanese pop culture. Brent Dill liked master/slave relationships with much younger women, which are a kind of consensual non-consent. Polyamory is big in this subculture. I don’t know much about Burning Man culture. But I can’t recall anyone in Bay Area rationalism and EA expressing interest in rape parties until Aella showed up. So is this like Yudkowsky spreading AI doomerism, and Alexander spreading neoreaction?

There is a difference between old-school SoCal kink, where you spend a lot of time making fursuits and paddles and occasionally use them with someone fetching, and Aella’s version, where you rent a house or a field and go to town on each other. The Rationalists don’t like the protective measures which kinksters have learned from experience, such as limiting or banning substance use, safewords, and joining a national or international kink community so you can get a second opinion about that proposition on FetLife. (Yudkowsky keeps posting “of course I use safewords, but what if I didn’t,” and I have seen a claim [https://archive.ph/SFCwS] that the rape parties involve games like drug roulette.) Many of them are hostile to mainstream ideas of informed consent, preferring a Libertarian approach where, if you sign a contract, what happens afterwards is your responsibility.

A Bad Trip in Berkeley

https://awful.systems/post/7336009

A Bad Trip in Berkeley - awful.systems

LSD and other psychedelics are popular in LessWrong circles, and there are hints that many members have had bad experiences or been abused while under the influence. While I am still working on a longer post, it looks like the critical period was 2016-2018. Some of this post may appear in that future longer post with fuller links.

Early Days

Before that, Shannon F. (who dated Yudkowsky and lived with the founder of the New York City chapter of LessWrong), gwern, Scott Alexander, and Aella posted positive-to-cautious views of psychedelics. Scott Alexander was interested in their medical uses, gwern self-medicated, and Aella and Shannon F. had a less clinical approach. SlateStarCodex helpfully polled readers about their substance use, and 17% of respondents said they had used LSD [https://www.astralcodexten.com/p/nootropics-survey-2020-results] in 2020. About 12% of Americans said they had tried psychedelics [https://www.sciencedaily.com/releases/2025/04/250421221118.htm] in 2023, so the response at SlateStar is quite high.

Looking Back in 2021

Jessica Taylor says [https://archive.ph/epyMh] that she both got a job with MIRI/CFAR and self-medicated with psychedelics in 2017. She told the story in 2021 as follows:

> the psychotic break was in October 2017, and involved psychedelic use (as part of trying to “fix” multiple deep mental problems at once, which was, empirically, overly ambitious); although people around me to some degree tried to help me, this “treatment” mostly made the problem worse, so I was placed in 1-2 weeks of intensive psychiatric hospitalization, followed by 2 weeks in a halfway house. This was followed by severe depression lasting months, and less severe depression from then on, which I still haven’t fully recovered from. I had PTSD symptoms after the event and am still recovering.

(“My experience at and around MIRI and CFAR (inspired by Zoe Curzi’s writeup of experiences at Leverage)”)

Taylor does not say where she got the idea to self-medicate with these substances, or that anyone encouraged her to do so, but the idea of self-medicating with psychedelics was certainly part of LessWrong online culture by 2018, and people tend to be more willing to talk about illegal activities than to post about them. 2016-2018 was the period when CFAR switched its focus to AI risk and was run by people close to Brent Dill. Many other people in the community chip in with their own experiences:

> humantoo: During my psychotic break, I believed that someone associated with Vassar had administered LSD to me. Although I no longer hold this belief, I cannot entirely dismiss it.

> jessicata: I remember someone who lived in Berkeley in 2016-2017, who wasn’t a CFAR employee but was definitely talking extensively with CFAR people (collaborating on rationality techniques/instruction?) and had gone to a CFAR workshop, telling me something along the lines of “CFAR can’t legally recommend that people try LSD, but…”; I don’t remember what followed the “but”, I don’t think the specific wording was even intended to be remembered (to preserve plausible deniability?), but it gave me the impression that CFAR people may have recommended it if it were legal to do so, as implied by the “but”. This was before I was talking with Michael Vassar extensively. This is some amount of Bayesian evidence for the above.

> Eliezer Yudkowsky: @Marie La I disagree and think the woo has proven in empirical practice to be sufficiently destructive to people who can’t see the destruction, to reach a level where it should not be tolerated by this group as a future subgroup norm, same as LSD use shouldn’t be tolerated by us as a subgroup norm.

> Aella: @Eliezer Yudkowsky On phone so thumb words but I notice I have a belief that this is predictable, and thus not dangerous? or rather, it’s something like if you’re religious and noticed some ppl have been drinking alcohol and then eventually losing their faith, you might be right to be wary of alcohol, but if you know that it’s actually the doubt of their faith that causes the alcohol drinking, then you wouldn’t be concerned if someone drinks alcohol but also isn’t doubting their faith.

> Eliezer Yudkowsky: My sense of “this seems to be ending very poorly on average” is much stronger for situations in which there’s a Leader or a Discernible Subgroup has formed, that are going up to others and saying “why, you really should try some psychedelics / woo”. Or where they wander up to individuals trying that, and put their arm around their shoulders all friendly-like.

It goes on; if you are interested in psychedelics, please ask harm-reduction experts where you live, not our friends! A deleted comment by PhoenixFriend [https://web.archive.org/web/20211129101704/https://www.greaterwrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe/comment/dLyEcki7dBdxFkvJd] contained the following:

> I’m a present or past CFAR employee commenting anonymously to avoid retribution. I believe that the dynamics of the organization grew to be significantly more cult-like than the OP and readers realize. I say all this with the hope that future-CFAR will emerge like a phoenix, leaving these issues in the ashes. … Psychedelic use was common among the leadership of CFAR and spread through imitation, if not actual institutional encouragement, to the rank-and-file. This makes it highly distressing that Michael (probably Michael Vassar) is being singled out for his drug advocacy by people defending CFAR.

The description of Scientology auditing and Maoist self-criticism at CFAR continues; reader beware. Duncan Sabien objects to PhoenixFriend’s characterization of what happened at CFAR.
Jax Romana accused Yudkowsky of dosing two women with psychedelics to control them [https://old.reddit.com/r/SneerClub/comments/awvems/in_which_ozy_is_confused_by_rsneerclub_and_math/] in 2018. To the best of my knowledge these allegations have not been made by anyone else and have not been independently confirmed.

A Hypothesis

Could a period of heavy experimentation at CFAR and in Michael Vassar’s circle in 2016-2018 have given other rationalists cold feet? The comments on the thread by Jessica Taylor are the oldest post I can find where a prominent rationalist suggests creating community norms to limit psychedelic use. Something must have really scared Yudkowsky if he is willing to tell his followers “Please seriously consider not doing drugs.” He is very reluctant to tell people what to do, except to give his organizations money. There are many people in SoCal who could have told them that the ways they wanted to use drugs, kink, and reprogramming techniques were very dangerous. I picked up the gist just from reading fannish publications.

In another thread, I have posted about the Form 990 for SIAI in 2009: was Ben Goertzel an employee, and was the $50k donation from Jeffrey Epstein passed on to another organization such as OpenCog, or kept in house? The Form 990 says “not an employee” and “kept in house”, but people who were staffers at the time tell different stories.
Eliezer’s response to being in the Epstein files - awful.systems

originally posted in the thread for sneers not worth a whole post, then I changed my mind and decided it is worth a whole post, cause it is pretty damn important. Posted on r/HPMOR roughly one day ago. Full transcript:

> Epstein asked to call during a fundraiser. My notes say that I tried to explain AI alignment principles and difficulty to him (presumably in the same way I always would) and that he did not seem to be getting it very much. Others at MIRI say (I do not remember myself / have not myself checked the records) that Epstein then offered MIRI $300K; which made it worth MIRI’s while to figure out whether Epstein was an actual bad guy versus random witchhunted guy, and ask if there was a reasonable path to accepting his donations causing harm; and the upshot was that MIRI decided not to take donations from him. I think/recall that it did not seem worthwhile to do a whole diligence thing about this Epstein guy before we knew whether he was offering significant funding in the first place, and then he did, and then MIRI people looked further, and then (I am told) MIRI turned him down. Epstein threw money at quite a lot of scientists and I expect a majority of them did not have a clue. It’s not standard practice among nonprofits to run diligence on donors, and in fact I don’t think it should be. Diligence is costly in executive attention, it is relatively rare that a major donor is using your acceptance of donations to get social cover for an island-based extortion operation, and this kind of scrutiny is more efficiently centralized by having professional law enforcement do it than by distributing it across thousands of nonprofits. In 2009, MIRI (then SIAI) was a fiscal sponsor for an open-source project (that is, we extended our nonprofit status to the project, so they could accept donations on a tax-exempt basis, having determined ourselves that their purpose was a charitable one related to our mission) and they got $50K from Epstein. Nobody at SIAI noticed the name, and since it wasn’t a donation aimed at SIAI itself, we did not run major-donor relations about it. This reply has not been approved by MIRI / carefully fact-checked, it is just off the top of my own head.

A series of talks in the Epstein documents

https://awful.systems/post/7127738

A series of talks in the Epstein documents - awful.systems

Does anyone know what this June 2019 text from Epstein [https://www.justice.gov/epstein/files/DataSet%209/EFTA00508126.pdf] is about? I have added some links to RationalWiki and Wikipedia but not corrected spelling. Was it at one of the institutions he sponsored, like MIT Media Lab? Or more like his conference in the Virgin Islands? It seems to mix mainstream figures and people in the Libertarian/LessWrong network. Another correspondent in 2016 suggested inviting Scott Alexander Siskind [https://www.justice.gov/epstein/files/DataSet%209/EFTA00824072.pdf] to speak at a different event Epstein was involved in. That correspondent has a Substack [https://joscha.substack.com/p/on-the-jeffrey-epstein-affair] which cites Siskind in 2025. Obviously, just because Epstein had heard of a public figure does not mean that they knew him. Epstein’s words begin below:

List for summer talks.

* avid Pizarro [https://en.wikipedia.org/wiki/David_A._Pizarro]. Professor of Psychology and Philosopher at Cornell Univcrsit
* Eric Weinstein [https://rationalwiki.org/wiki/Eric_Weinstein], Mathematician
* Matthew Putman, Scientist
* Paul Saffo, Technology Forecaster, and Professor of Engineering
* Lori Santos, Professor ofPsychology and Cognitive Science
* Janna Levin, Theoretical Cosmologist
* Ev Williams, Internet Entrepreneur
* Phoebe Waller-Bridge [https://en.wikipedia.org/wiki/Phoebe_Waller-Bridge], Author
* Heiner Gocbbels, Composer, and Director
* Martine Rothblatt, Lawyer and Entrepreneur
* Peter Thiel [https://rationalwiki.org/wiki/Peter_Thiel], Venture Capitalist, and Entrepreneur
* Richard Thaler, Behavioral Economics
* Barbara Tversky, Professor of Psychology
* Michael Vassar [https://rationalwiki.org/wiki/MetaMed], Futurist, Activist
* bret Weinstein [https://rationalwiki.org/wiki/Bret_Weinstein], Biologist, and Evolutionary Theorist
* Susan Backfield, MIT President, Professor of Neuroscience
* David Deutsch, Physicist
* Eliczer Yudkowsky [https://rationalwiki.org/wiki/Eliezer_Yudkowsky], Al Researcher
* N. Jeremy Kasdin, Astrophysicist
* Carl Zimmer, Science Writer
* Douglas Rushkoff, Media Theorist
* Eric Topol, Cardiologist
* Dustin Yellin, Artist
* Sherry Turkic, Professor of Social Studies
* Taylor Mac, Actor
* Stephen Johnson, Author
* Martin Hagglund, Swedish Philosopher and Scholar of Modernist Literature
* Thomas Metzinger, Philosopher, and Professor of Theoretical Philosophy
* Bjarke Ingels, Danish Architect, Founder of BIG, currently working on Floating Cities/Sustainable Habitats project
* Kai-Fu Lee, Venture Capitalist, Technology Executive, and Al Expert, developed the world’s first speaker-independent continuous speech recognition system
* Poppy Crum, Neuroscientist, and Technologist, Chief Scientist at Dolby Laboratories, Adjunct Professor at Stanford University (Computer Research in Music)
* Neil Burgess, Researcher, and Professor of Cognitive Neuroscience, investigating the role of the hippocampus in spatial navigation
* Paul Sloom, Psychologist, and Researchvr exploring how children and adults understand the physical and secin’ world, with a special focus on language, religiom and morality
* Brian Cox [https://rationalwiki.org/wiki/Brian_Cox], Physicist, and Professor of Particie Physics, Presenter of Science Programs
* Eythor Bender. CEO of Berkeley Bionics, I nnovator and Business Leader in human augmentation (bionics and robotica)
* Gwynnc Shotwell President. and COO at SpaceX, Engineer. lista! in 2018 as the 59th most powcrful woman in thc world by Forbcs
* Jaap de Roodc. Associatc Professor of Evolution (of parasites) and Ecology, facusing on how parasites attack monarch butterflies and in return how butter0ies have the ability to self-mcdicatc
* Jim Holt, American Philosopher, and Contributor to the Ncw York Times writing on string theory, time, thc universc, and philosophy
* Vijay Komar, Indian Roboticist and UPS Foundation Professor in School ofEngineering & Applied Science:. became Dean of Penn Engineering, studies flying and cooperativc robots
* Hugh Hcrr, Biophysicist, Engineer, and Rock Climber, builds prosthetic knces, legs, and onkies that fusc biomechanics with microprocessors at MIT
* Gabriel locman, French Economist at UC Berkeley. best known for his research on tax havens, inequalities, and global wcalth
* Fci-Fei Li, Professor of Computer Science, Director of Stanford’s Human-Ccntered Al, worIcs as Chief Scientist of Al/ML of Google Cloud
* Dennis Hong, Korean Amcrican Mechanica! Engineer, Professor and Founding Director of RoMeLa (Robotics & Mechanisms Laboratory) of thc Mechanica) & Acrospace Engineering Dcpartment at UCLA
* Misha (Mikhail) Leonidovich Gromov, American

CFAR lists nine employees with six-figure salaries, plus a president. Oliver Habryka is one of those employees, at the lower end of the pay scale. Lightcone lists Habryka with a $3,000 honorarium and $110,000 in other salaries and expenses, which looks like one or two system administrators or IT technicians. In 2024 Lightcone Infrastructure gave most of its expenses to something called Lightcone Research, which actually operates LessWrong, and I predict that in 2026 Lightcone will give most of the money raised to CFAR to pay the mortgage on the Rose Garden property and be very worried about Robot God.

In December Lightcone raised $1.6 million in donations, plus a 12.5% matching donation from the Survival and Flourishing Fund. They had threatened to shut down if they didn’t raise $1.4 million, and wanted at least $2 million.

Jaan and SFC (Jaan Tallinn and the Survival and Flourishing Corp) helped us fund the above-mentioned settlement with the FTX estate (providing $1.7M in funding). This was structured as a virtual “advance” against future potential donations, where Jaan expects to only donate 50% of future recommendations made to us via things like the SFF, until the other 50% add up to $1.29M in “garnished” funding. This means for the foreseeable future, our funding from the SFF is cut in half.
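The “advance” structure in the quote above amounts to simple arithmetic: half of each future SFF recommendation is withheld until the withholdings total $1.29M. A minimal sketch (the $1M/yr donation stream below is invented for illustration, not a figure from the source):

```python
def apply_garnishment(recommendations, advance=1.29e6, rate=0.5):
    """Split each future recommendation: `rate` of it repays the advance
    until the garnished total reaches `advance`; the rest is paid out."""
    received, garnished = [], 0.0
    for amount in recommendations:
        withheld = min(amount * rate, advance - garnished)
        garnished += withheld
        received.append(amount - withheld)
    return received, garnished

# Hypothetical $1M/yr in future SFF recommendations:
received, garnished = apply_garnishment([1e6, 1e6, 1e6])
# received -> [500000.0, 500000.0, 710000.0]; garnished -> 1290000.0
```

On these assumed numbers, Lightcone would see only half of its SFF funding for roughly two and a half years before the advance is paid off, which matches the quote’s “funding from the SFF is cut in half ... for the foreseeable future.”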

Lightcone Infrastructure did not list any large liabilities like this on its 2024 form 990, but CFAR listed several things which could cover it if the settlement was in 2024.

In December MIRI raised $1.6 million in donations, plus a 100% matching donation from the SFF. They wanted a total of $6 million. The donations grew from $1 million to $1.6 million in the last few days, suggesting that they talked a few of their upper-middle-class supporters into chipping in amounts in the high tens or low hundreds of thousands to capture the matching donation. Both fundraisers reached their minimum targets but not their goals.
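As a back-of-envelope check on the two matches above (figures from this post; the optional cap parameter is my own generalization, since matching pledges are usually capped):

```python
def total_with_match(donations, match_rate, match_cap=None):
    """Donor gifts plus the sponsor's proportional match, optionally capped."""
    match = donations * match_rate
    if match_cap is not None:
        match = min(match, match_cap)
    return donations + match

lightcone = total_with_match(1.6e6, 0.125)  # 12.5% SFF match -> $1.8M total
miri = total_with_match(1.6e6, 1.0)         # 100% SFF match  -> $3.2M total
```

So Lightcone’s $1.6M lands at $1.8M, clearing its $1.4M minimum but short of the $2M it wanted, while MIRI’s $1.6M doubles to $3.2M, still barely half of its $6M goal.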

Toss a bitcoin to your Lightcone – LW + Lighthaven's 2026 fundraiser — LessWrong

TL;DR: Lightcone Infrastructure, the organization behind LessWrong, Lighthaven, the AI 2027 website, the AI Alignment Forum, and many other things, n…

In November 2024, Habryka also said “we purchased a $16.5M hotel property, renovated it for approximately $6M and opened it up … under the name Lighthaven.” So the disconnect between what Lightcone says to the taxman (we are small bois, CFAR owns the real estate) and what it says to believers (we own the real estate) was already there.
(The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser — LessWrong

TLDR: LessWrong + Lighthaven need about $3M for the next 12 months.

It appears that MIRI had an inflection point in 2019, when they grew from a $3.6m/yr organization to a $6m/yr one. In 2021 they received $25 million in donations, and they have been burning through that ever since. They received $15.6m in crypto from one anonymous donor and $4.4m in crypto from Vitalik Buterin of Ethereum in 2021. Since then they report $1.6m to $1.9m per year in donations, and their December 2025 fundraiser aimed at $6 million and has reached $1.2 million (half of that a match from the Survival and Flourishing Fund, i.e. probably another tech exec or crypto gambler). So MIRI depends on a few rich donors and could not survive in its present form on ordinary rationalists chipping in.

Their executive compensation exploded from $1.3m to $3.1m in 2024. Their other activities were giving out $280k in grants and spending $34k on a conference. What are they doing with all that money? You don’t need $3.1m to write a trade book and get on some podcasts.

Our all-time largest donation, and major crypto support from Vitalik Buterin - Machine Intelligence Research Institute

I’m thrilled to announce two major donations to MIRI!   First, a long-time supporter has given MIRI by far our largest donation ever: $2.5 million per year over the next four years, and an additional ~$5.6 million in 2025. This anonymous donation comes from a cryptocurrency investor who previously donated $1.01M in ETH to MIRI […]