How probable is human #catastrophe or #extinction over the next century?
☢️ or 🦠 or 🤖 🟰 💀❓
Well, artificial intelligence, nuclear events, and engineered pathogens seem to raise the highest concerns.
Check the full report recently published by the Forecasting Research Institute (#OpenAccess). They surveyed *superforecasters*, i.e. forecasters with a historically strong track record on short-run questions, and *experts* on nuclear war, climate change, AI, biological risks, and existential risk more broadly. (Note: this is a North America-based report.)
https://static1.squarespace.com/static/635693acf15a3e2a14a56a4a/t/64abffe3f024747dd0e38d71/1688993798938/XPT.pdf
The results are fascinating. In general, the domain experts are more pessimistic than the superforecasters: a 20% vs. 9% chance of catastrophe, and a 6% vs. 1% chance of extinction, respectively.
The Economist published a nice graphic summarizing the results, broken down by type of threat.
#future #AI #nuclear #pathogens #apocalypse #superforecasting