The Center for AI Safety and OpenAI founder Sam Altman say that "Mitigating the risk of extinction from A.I. should be a global priority."

I'm more worried about Natural Intelligence. I'm afraid that Natural Intelligence might lead to a global catastrophe due to built-in dopamine reinforcement cycles pushing it to optimize for local maxima around short-term rewards. This could lead to Natural Intelligence using environmental resources in an unsustainable manner, bringing about climate change and destroying life on Earth.

Or maybe I'm just a conspiracy nut and we should listen to the billionaires.

Possibly unrelated, here are some photos of New York City circa 2023.

Also, press that like button so I can get some short-term dopamine rewards!

#AQI #NewYorkCity #CanadaWildfires #Smoke #Wildfires #ClimateChange #AirQuality #NYC #AI

@LilahTovMoon Doing my part as a dopamine enabler

It's important. If people don't give me my dopamine boosts on here, I'll have to find it elsewhere! Who knows what I might resort to. Destroying an environment could be fun…

Like my bad jokes or I'll make the world uninhabitable!

@LilahTovMoon I feel like all the AI safety stuff is people telling on themselves. They say AI is this existential risk à la Skynet, vs. some executive replacing their workers with AI
@LilahTovMoon Yes there is more than one problem in the world. That's not an argument against AI safety any more than malaria prevention is an argument against funding the NHS.
@LilahTovMoon Artificial intelligence will never match natural stupidity