Wait, Julia Galef the skeptic married the guy who took Scientology courses and joined their Toastmasters club?

Valentine also asked Where’s the economic incentive for Wokism coming from? in 2022. He was one of the four co-founders of CFAR and left because he had been too close to Brent Dill. Previous discussion here. Someone tried to explain:

The reality is that in the Anglosphere there are lots of progressive people with money to spend on media. You can sell “woke” media to those people, and lots of it. Even more so when there’s controversy and you can get naive lefties to believe paying money to the megacorp to watch a mainstream show is a way to somehow strike back against the mean right-wingers. And to progressive people it doesn’t feel like “being lectured to about politics”, because that’s not what media with a political/values message you agree with feels like. So going woke is 100% a profit-motivated decision.


Vassar’s Twitter feed is retweets about how solar power is awesome, how the state should summarily execute murderers if they are “drug-addicted parasites,” how the UK should attack Iran because Iran attacked the US base on Diego Garcia, and about Crémieux Recueil / Jordan Lasker. To me that is clearer and more important than exactly what he said about psychedelics ten years ago!
CFAR is Back - awful.systems

Most of Bay Area LessWrong operated within two nonprofits, MIRI and CFAR [https://rationalwiki.org/wiki/CFAR]. CFAR was ostensibly about live-in workshops teaching rationality skills; you had to dig deeper to see that the skills were meant to make you a better Effective Altruist or AI ‘risk’ ‘researcher’. Up to the end of 2024, LessWrong and the Lighthaven campus operated within CFAR as independent projects. CFAR proper does not seem to have done much from spring 2020 to spring 2025, but their head Anna Salamon has started to organize new events. Some highlights:

- Since 2018 they have held a mortgage on their own bed-and-breakfast, a mansion in Bodega Bay, CA (about 10% as expensive as Lighthaven in Berkeley).
- One of their founders left to work as a quant for Jane Street Capital.
- Jessica Taylor had something to say about Salamon in her 2021 debate with Scott Alexander [https://www.lesswrong.com/posts/pQGFeKvjydztpgnsY/occupational-infohazards] about whether MIRI and CFAR were a lot like the Vassarites and Leverage:

> Anna Salamon expressed discontent that Michael Vassar was criticizing ideologies and people that were being used as coordination points, and hyperbolically said he was “the devil”. Michael Vassar seemed at the time (and in retrospect) to be the single person who was giving me the most helpful information during 2017. … Anna Salamon frequently got worried when an idea was discussed that could have negative reputational consequences for her or MIRI leaders. She had many rhetorical justifications for suppressing such information. This included the idea that, by telling people information that contradicted Eliezer Yudkowsky’s worldview, Michael Vassar was causing people to be uncertain in their own head of who their leader was, which would lead to motivational problems (“akrasia”).

(Vassar tweets things like “Aspergers started out as a malphemism for that Ashkenazi heritage though.” [https://xcancel.com/HiFromMichaelV/status/1633481164441755649#m] and people who have met him say he argues that pedophilia is educational! If you think he provides helpful information, that is bad news!)

- Their June 2026 workshops [https://www.lesswrong.com/posts/5AGK8b3rm8YD7jhxi/cfar-is-running-an-experimental-mini-workshop-june-2-6] were at Lighthaven.
- Duncan “punch bug” Sabien appeared in the comments of a post in September [https://www.lesswrong.com/posts/AZwgfgmW8QvnbEisc/cfar-update-and-new-cfar-workshops] to say that he would not recommend attending an event with any of these people. He ran Dragon Army while holding down a day job with CFAR and now has a Substack blog.
- In December Salamon published a retrospective [https://www.lesswrong.com/posts/4W8ZbcRr47x9bNEf6/what-s-going-on-at-cfar-updates-and-fundraiser] that dances around what went wrong and what she will do differently next time.
- Their fundraiser raised $10,000 and did not have any generous benefactors matching small donations.
- Someone called Michael “Valentine” Smith left CFAR in 2018, posted a long essay about how he thought obsessing about AI doom in the future was a way not to think about past traumas, and is back to posting profound anthropological insights [https://www.lesswrong.com/posts/nvmfqdytxyEpRJC3F/is-being-sexy-for-your-homies] from his love life to LessWrong:

> As far as I know, every culture throughout all known history has made a point of having men and women act as two mostly distinct social clusters most of the time.

Talking about cults and cranks is one angle, but I think you could also talk about how a majority of the leadership of LW and LW-adjacent organizations seem sleazy and dangerous to be around.
I hope more people manage to break all the way free from them, rather than quitting CFAR and marrying an OpenPhil staffer, or leaving MIRI and launching their own apocalyptic movement.

By Rationalist standards, Aella and her friends are good at asking “what could go wrong and how could we mitigate it?” I think most attendees understand “I might get bruises or herpes” better than most donors to MIRI understand that Yudkowsky might take $600k a year and just tweet some more. I would not trust them on the implementation!
I forgot that Ivy Astrix says that someone called chaosprime who was active in Vibecamp and TPOT (a rationalist-adjacent Twitter community with a summer camp) turned out to be a convicted child molester. Funny how that keeps on happening!
Reflecting On a Few Very, Very Strange Years In Silicon Valley's Rape Culture

On rape culture in Silicon Valley and its subcultures: Rationalism, Effective Altruism, TPOT, and Vibecamp. Also the informal rules that dictate how women are allowed to talk about sexual violence.

The Asterisk
My first degree was a professional degree, so after college I went out and got a paid job doing that, using the experience I had developed in paid summer jobs. Even when I was young I think I would have said no to Leverage Research.

I think that some 1980s and 1990s fantasy, like The Mists of Avalon and a sprawling unfinished series, has the “women are magical, men are not” trope. The author of Mists covered up for a pederast. I did not know this idea also comes up in Satanism.

Some forms of sex magic have the idea that men need a female ritual partner to be complete, which echoes Scott Aaronson’s idea that he needs a woman to bear his babies to be complete, and that in a just world he would be the chief rabbi and the peasants and craft workers would give him their prettiest daughter.

Someone called Lennox posted some thoughts on “ideological drift” or Of Marx and Moloch where he seems to say that his time in Effective Altruism was channelling his frustration that he had not pair-bonded yet. Even though he sees many problems with EA, like the gigantic probabilistic models with incredible numbers of assumptions layered on assumptions which can only ever approve of a limited range of interventions, he still seems to think their approach is right.

Someone shared a list of incidents at these parties with me, but I don’t have the link handy and I would be uncomfortable posting it anyway. Right now we have the stories by people close to Aella that they are responsible and fun, and the Buddhist poster who says there are drugs and unclear boundaries. I would not recommend that anyone explore risky things with our friends.

Edit, was the sprawling series Wheel of Time?

Of Marx and Moloch: How My Attempt to Convince Effective Altruists to Become Socialists Backfired Completely

Why psychology explains politics better than politics explain psychology

Lennox

Back and forth a few years ago on the SlateStarCodex subreddit, roughly:

Scott Alexander: Bay Area rationality is wonderful, we have foundations and group homes and jolly social activities and a Solstice ritual and even “Reciprocity and Propinquity: two different rationalist dating/matchmaking services”

Rando:

I don’t know, I live in a nice community in a different city where people I know have lots of Shabbat dinners, choirs, board game nights, discussions, etc. And zero people I know have joined a cult, and one person I know has developed psychosis, but she had a family history of psychosis, started having symptoms in early adulthood, and pretty quickly went on antipsychotics and got a lot better.

Is it just that California attracts weird shit and if you put people in California, whatever they’re already doing will get culty?

Alexander: base rates! how do your demographics compare to ours?

Rando:

Probably similar size and age? Nearly everyone I know has parents who are teachers/lawyers/doctors/therapists/etc., so I guess upper middle class according to that book you wrote about a while ago.

It’s not like everyone’s doing great, lots of people have depression and anxiety and probably smoke more weed than is good for them. Most of those people already had those problems from their adolescence.

But our rates of weird problems, like multiple people with overlapping psychoses tied to some guy, are low.

Has the person in the comments musing about whether women like to quantify things ever watched a mother with small children, or a retired woman on a fixed income juggling membership rewards points, coupons, and competing places to shop?