A while back, recovering from surgery and desperate for books, I was tricked by Audible's recommender system into buying Jordan Peterson's book. In my defence, I didn't know who he was then. I think before the end of chapter one, I went from ‘lobsters have a complex social hierarchy, interesting!’ to ‘OMG, I’m inadvertently reading trash!’ It was the only time I've asked Amazon for a refund, and I escaped unscathed, and wiser.

Fast forward to recently, when I spotted #WilliamMacAskill's #WhatWeOweTheFuture, which is all about how we owe it to future generations to not screw up the present. It recognises that humanity can now drive itself to extinction, and tries to figure out some strategies for avoiding that.

It seemed like it might press my Armageddon-fascination buttons but offer some rays of hope, and maybe suggest concrete actions.

Initially, it played out like that:

WWOTF: OMG for the first time in history we can annihilate ourselves in multiple ways!
Me: Yep!
WWOTF: We should try to avoid that!
Me: I don't want my descendants living out Mad Max. How can I be a Good Ancestor?
WWOTF: The nuclear threat hasn't gone away!
Me: I know! Philomena Cunk does too: https://www.youtube.com/watch?v=5zabCBnUHLA
WWOTF: Climate Change!
Me: Yeah, I have to scroll quickly past COP27 headlines to preserve my mental health…
WWOTF: Pathogens!
Me: Topical.

WWOTF: AI apocalypse!
Me: …well. I mean, not ‘the singularity’ like Terminator or The Matrix. But yeah, it's locking in some bad prejudices, and it's being used as a tool for evil. So, yeah, kinda.
WWOTF: What doesn't kill us might nevertheless destroy civilisation, it's happened before!
Me: Sure. Kinda like the last half of Threads, right? https://youtu.be/FDmrFjQFQ38

I mean, I know philosophers are, and should be, open to considering things that seem ridiculous, and maybe I should charitably assume that there's an intelligent pay-off at the end.

But I'm getting that sinking feeling that I'm a sucker tricked into reading trash again.

Has anyone out there finished #WhatWeOweTheFuture, and do you think I should forge ahead?

Wow! A timely toot from the amazing @timnitGebru has led me to this confirmation of my suspicions:

https://www.currentaffairs.org/2021/07/the-dangerous-ideas-of-longtermism-and-existential-risk

I think I can stop wasting my time on #WhatWeOweTheFuture

The Dangerous Ideas of “Longtermism” and “Existential Risk” ❧ Current Affairs

So-called rationalists have created a disturbing secular religion that looks like it addresses humanity’s deepest problems, but actually justifies pursuing the social preferences of elites.

@timnitGebru @robertfromont It does not feel like a coincidence that longtermism (as a philosophy) effectively minimizes any negative externalities of the billionaire lifestyle as a rounding error.

Poaching a tiger? Well, from a long-term view the tiger was going to die anyway.

@SamTheGeek @timnitGebru @robertfromont This, and also the distraction from real problems. EA figured out that malarial bed nets are a really good use of charity money, and then the longtermists came in and decided it was less exciting than arguing over how many tech founders can dance on the head of a pin.

@Alon @SamTheGeek @timnitGebru @robertfromont In the end you can't do EA without choosing a political orientation. The movement ended up dominated by libertarians who believe they can spend money better than the government.

Would be cool to have an explicitly socdem or social liberal EA. A good movement should repel all but the most moral minority of billionaires.

@DiegoBeghin @SamTheGeek @timnitGebru @robertfromont Yeah, on Birdsite, I stunned a libertarian who just went ahead and assumed that weakening the state was good for fighting existential risks; I pointed out that every single success against corona was handled by a state (such as Taiwan), whereas the tech industry, even when it was right (e.g. Balaji Srinivasan), couldn't actually solve the problem, just reduce its own exposure.