8 Comments
May 26 · Liked by Duncan Sabien

> This raises a fascinating question, which Moreton discusses but doesn’t fully answer, about how society (or a smaller set of individuals, or a single person) decides which wants are valid, and thus morally endorseable.

My kinda-hot take on this (with which I'm curious to what extent you agree/disagree):

In my ideal society, literally all wants are theoretically endorsable, but some are so costly to accommodate that, in practice, they're just not worth accommodating.

I'm reminded of a news story I read somewhere (idk if it's true, but for the point I'm making here its truth or falsity matters less than its value as a thought experiment) about someone who really wanted to have the experience of killing someone. She (I'm about 60% confident the two main individuals involved were both female, so I'm just going to go ahead and use female pronouns for both) found a stranger who was very old and likely wouldn't have lived much longer anyway, killed her, and turned herself in to the police.

I predict most people's main emotional reaction to this would be some combination of horror, disgust, and cold/ironic amusement. I felt some amount of those things, but mostly I felt sad that this killer - who is clearly *not* merely a selfish asshole, judging by her effort to minimize the harm done by her actions and by her willingness to accept the consequences - felt that the least shitty option available to her was to kill someone against their will and end up in prison herself.

In what I would consider a perfect world, it would be socially and legally permissible for two people to make an agreement in which one grants the other permission to kill them without punishment (e.g. "you kill me, and in exchange you give all my kids/grandkids/whatever $X", or "you give me $X now, and in exchange you get to kill me Y years from now"). Sadly, though, if we took the world as it exists and made such agreements legally permissible without changing anything *else*, that change would likely not be an improvement. Perhaps some people who want to experience killing would get to do so with the consent of their victims and without punishment, which I would consider a benefit, but that benefit would likely be outweighed by the costs: any method of legally establishing consent to be killed would necessarily be imperfect, and the change would make it easier to effectively "get away with murder" by socially manipulating people into consenting and/or fabricating evidence that they consented.

author

Yeah, I largely agree (in actual practice most libertarians seem sketchy or shitty and I avoid them pretty hard, but the steelman version of libertarian philosophy is something I endorse).

Ditto for things like the right to suicide, or consensual cannibalism, or the ability to try experimental and untested drugs/treatments for terminal illnesses without hindrance.

May 28 · Liked by Duncan Sabien

> a lot of the other language in the chapter kind of pushes the implication that everything is socially motivated and transactional

It's an especially effective social fabric that can actually change people's desires, instead of just suppressing some.

May 27 · Liked by Duncan Sabien

I'm surprised not to find the word "fair" in your essay, though perhaps it shows up in the source book. (I wasn't able to find a copy, but I'll keep an eye out for one. 😉) It seems to me, for example, that there's a very natural frame in which the green guy in section 3 is getting an unfair deal, in that the gains-from-civilization are being disproportionately allocated to the others. I thought perhaps this would be addressed in the Veil of Ignorance section, as being about righting the inequality of nature through acausally purchasing insurance (e.g. https://utopiandreams.substack.com/p/basic-income-part-1-welfare), but that might be beyond the 7th-grade level. Nevertheless, basic concepts of fairness seem integral to an understanding of social dynamics, even for children.

Or at least, they do to me. But I'm also confused about the degree to which civilization is simply isomorphic to Those Who Follow True Ethics, so perhaps this just runs into that same confusion.

Regardless, I'll leave this here, as potentially related: https://utopiandreams.substack.com/p/fairness

author

Unrelatedly, Logan was imagining a future scenario in which there are multiple kids running around, and one of the kids comes up screaming "he did [whatever], it's not fair!", and Logan is unable, as a parent, to give the kid what they actually want, b/c of a primary response of something like "idk, what's good about 'fair'? Explain to me why that's something I should make happen."

May 27 · Liked by Duncan Sabien

<nitpick>Also, obligatory: orbits *really* don't work like that in real life, and I resent the potential confusion introduced in readers (incl. children) of Civilization & Cooperation who don't know better.</nitpick>

May 26 · Liked by Duncan Sabien

Such a fantastic essay, thank you!

May 26 · Liked by Duncan Sabien

Great read; I was reminded of Kevin Simler's essays "UX and the civilizing process" (https://meltingasphalt.com/ux-and-the-civilizing-process/) and "Personhood: a game for two or more players" (https://meltingasphalt.com/personhood-a-game-for-two-or-more-players/).
