Applied to Book review: The Importance of What We Care About (Harry G. Frankfurt) by David Gross 15d ago


I find this paragraph confusing:

"Not to be confused with maximization of utility, or expected utility. If you're a utilitarian, you don't just sum over possible worlds; you sum over *people*."

It's not clear to me why utilitarianism is not about maximization of expected utility, especially since, for a utilitarian, utility and welfare can presumably be the same thing. It also seems fairly obvious that you sum over people, but the notion of possible worlds is not so easy to interpret here.


A utility function is a function on a sample/probability space; in this case, the space whose points (elementary events) are possible worlds. Expected utility is the expected value of a utility function over some subspace of a probability space; in this case, its definition sums over possible worlds. Talking about people requires a different framing: to define a utility function that sums over people, you need to look inside each possible world.
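For what it's worth, the distinction can be made concrete in a few lines of Python. This is a minimal sketch, not anything from the original post; the worlds, welfare numbers, and probabilities are made up for illustration:

```python
# Sketch (illustrative numbers only): an expected-utility sum ranges over
# possible worlds, while a utilitarian utility function looks *inside* a
# world and sums over the people in it.

# Each possible world records per-person welfare.
worlds = [
    {"alice": 2.0, "bob": 1.0},  # world A
    {"alice": 0.0, "bob": 3.0},  # world B
]
probs = [0.5, 0.5]  # probability of each world

def utilitarian_utility(world):
    # Sum over people, within a single world.
    return sum(world.values())

# Sum over possible worlds, weighted by probability.
expected_utility = sum(p * utilitarian_utility(w)
                       for p, w in zip(probs, worlds))
print(expected_utility)  # 0.5 * 3.0 + 0.5 * 3.0 = 3.0
```

The point is that the sum over people happens inside the utility function, while the sum over worlds happens in the definition of expectation; the two are different operations at different levels.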


But is it important in utilitarianism to think about people? As far as I can tell, utilitarianism is not incompatible with views like panpsychism, where there could be sentience without delimited personhood.

Anyway, even if technically correct, I think this is a bit too complicated and technical for a short introduction to utilitarianism.

What about something simpler, like: "Utilitarianism takes into account the interests of all sentient beings."? Perhaps we could add something on scale sensitivity, e.g.: "Unlike deontology, it is scale sensitive." I don't know if what I propose is good, but I think there is a need for simplification.


Applied to Theodicy and the simulation hypothesis, or: The problem of simulator evil by philosophybear 9mo ago
