Read more: https://en.wikipedia.org/wiki/Longtermism
TIL of Longtermism, an ethical stance that values future people just as much as people alive today. From a longtermist perspective, killing billions of humans today would be perfectly moral if it ensured the survival of trillions of humans a million years from now.
5 Comments
Like many (seemingly) logical dogmas, it could be used to further some very regrettable ends.
If they value people alive today just as much as future people, I don't think they would murder billions of people.
Allowing future people to live also lets them die, which kills them, so it is a self-defeating argument!
For what it’s worth, I believe FTX’s Sam Bankman-Fried is/was a big proponent of longtermism.
This is the philosophy of SBF.
It’s what mentally allowed him to justify everything.
The philosophy of the 1%ers always seems to end up being whatever justifies inflicting pain on people in the moment.