@gwynnion I think the goal of longtermism is to justify one's personal preferences behind a veneer of objectivity.
The fact is, nobody can tell the future, so pretending to know what will maximize the human population in a million years is a level of hubris only attainable by those who have been given (unearned) the wealth of literal gods.
In a way, it's pretty sad...they can't just say to themselves "this is what I want so I'll make it happen"...they have to invent some pseudo-scientific justification for themselves.
Money corrupts....but it also weakens...billionaires are the weakest of us.
@gwynnion yeeeeeep.
“The real problem isn’t climate change, which would require us to cut into our profit margins to address now, but some hypothetical threat that may be presented by Roko’s Basilisk in 500 years!”
@gwynnion Oh, don't forget the ways that it normalizes and justifies eugenics and ethnic cleansing and genocide. After all, if the good of future generations matters far more than any pain, suffering, or crimes committed in the present...
The philosophy is basically just waiting for people to fill in the blanks with their preferred "solutions" to future "problems".