I wondered about this today, googled it, and should not be surprised that Scott Alexander thought about it years ago:)
A couple of thoughts, very late to this discussion.
First, perhaps human consciousness is highly individuated, so each human counts for one when we’re reasoning anthropically. But if there are hive minds, then maybe thousands of ants count for only one. Perhaps even Norway rats are similar enough to each other that, though they’re not a hive mind, they receive less anthropic weighting. Perhaps the proper reference class is types of consciousness, and the more individuated a consciousness is, the more it is its own type, so the more anthropic weighting it receives.
The individual rat is sentient, but the anthropic question is: which type of consciousness are you likely to be?
My second thought is that humans are vastly outnumbered on Earth by non-human sentients, but Earth sentients, human and non-, are super-vastly outnumbered by all the sentients across the Multiverse and its Sims. So: I’m unlikely to be a human, but am only slightly more likely, all sentients considered, to be a sentient Earthling. The probability gap between human and non-human Earthling just isn’t that great, relative to the probability gap between Earthling and non-Earthling.
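Purely to illustrate that gap argument, here is a back-of-the-envelope sketch with placeholder numbers of my own invention (nothing here pins down the actual counts of humans, Earth sentients, or multiverse/sim sentients; the exponents are arbitrary stand-ins):

```latex
% Toy, purely illustrative counts (assumptions, not estimates):
% N_h = humans           = 10^{10}
% N_e = Earth sentients  = 10^{19}
% N_a = all sentients    = 10^{30}  (multiverse plus sims)
\[
P(\text{human}) = \frac{N_h}{N_a} = 10^{-20}, \qquad
P(\text{Earthling}) = \frac{N_e}{N_a} = 10^{-11}.
\]
\[
\underbrace{P(\text{Earthling}) - P(\text{human})}_{\approx\, 10^{-11}}
\;\ll\;
\underbrace{P(\text{non-Earthling}) - P(\text{Earthling})}_{\approx\, 1 - 2\times 10^{-11} \,\approx\, 1}.
\]
```

On numbers like these, the absolute gap between being a human and being a non-human Earthling is negligible next to the gap between being an Earthling and not being one, which is the shape of the claim above.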
Actually, if I may, a third thought: why not reason as follows to include animals among the sentient? I am a sentient lifeform; my being one would be highly unlikely if non-living things (e.g. every grain of sand and every water molecule) were sentient also, since non-living sentients would vastly outnumber living ones; therefore, probably, only living things are sentient.