Valuing Sentience: Can They Suffer?

by jefftk · 29th Jul 2013 · 1 min read

Tags: Animal Ethics, Ethics & Morality, Suffering

In the recent discussions here about the value of animals, several people have argued that what matters is "sentience", or the ability to feel. This goes back at least to Bentham, with "The question is not, Can they reason? nor, Can they talk? but, Can they suffer?"

Is "can they feel pain" or "can they feel pleasure" really the right question, though? Let's say we research the biological correlates of pleasure until we understand how to make a compact and efficient network of neurons that constantly experiences maximum pleasure. Because we've thrown out nearly everything else a brain does, this has the potential for orders of magnitude more sentience per gram of neurons than anything currently existing. A group of altruists intends to create a "happy neuron farm" of these: is this valuable? How valuable?

(Or say a supervillain is creating a "sad neuron farm". How important is it that we stop them? Does it matter at all?)
29 comments, sorted by top scoring

Eneasz (12y, 24 points):

Can we taboo "Suffer"? Because at this point I'm not even sure what that means. Is it "a biological signal that identifies damage"? That seems too simple, because most sophisticated machines can detect damage and signal it, and we don't particularly worry ourselves about that.

From Catch-22, on God and pain:

Oh, He was really being charitable to us when He gave us pain! Why couldn't He have used a doorbell instead to notify us, or one of his celestial choirs? Or a system of blue-and-red neon tubes right in the middle of each person's forehead. Any jukebox manufacturer worth his salt could have done that.

On RadioLab they described a wasp whose midsection had been accidentally crushed. As it was dying, it began to eat its own viscera, likely because it detected a rich food source and began executing the standard action for being in the presence of one. It was at this point that I finally intuitively understood that insects are simply biological replicating machines. I can no longer think of them as feeling anything akin to suffering, merely as running damage-avoidance subroutines.

It seems we're concerned about the capacity of a mind to experience something it wants to avoid. Doesn't that imply that the complexity of the mind is a factor?

DanielLC (12y, 11 points):

Can we taboo "Suffer"? Because at this point I'm not even sure what that means.

We cannot, for the same reason we can't taboo consciousness. None of us are sure what it means.

All I can say is that it's the sucky part of consciousness.

aelephant (12y, 1 point):

It sucks to experience it personally, but maybe it serves an evolutionary purpose that we don't yet fully understand, and eliminating it completely would be a mistake?

ThisSpaceAvailable (12y, 5 points):

"It serves an evolutionary purpose" and "eliminating it completely would be a mistake" are two completely different claims. While there is correlation between evolutionary purposes and human purposes, the former has no value in and of itself.

DanielLC (12y, 1 point):

It serves an evolutionary purpose that's pretty obvious and eliminating it entirely would cause a lot of problems. We can still find a way to improve the status quo though. We didn't evolve to maximize net happiness, and we're going to have to do things we didn't evolve to do if we want to maximize it.

Desrtopa (12y, 0 points):

I think we already have more than an inkling of the usefulness of suffering over warning signs which are less burdensome to experience. It can be awfully tempting to override such warning signs when we can.

Imagine a group of hunters who're chasing down a valuable game animal. All the hunters know that the first one to spear it will get a lot of extra respect in the group. One hunter who's an exceptional runner pulls to the head of the group, bearing down on the animal... and breaks a bone in his leg.

In a world where he gets a signal of his body's state, but it's not as distressing as pain is, he's likely to try to power on and bring down the game animal. He might still be the first to get a spear in it, at the cost of serious long term disability, more costly to him than the status is valuable.

The hunter's evolutionary prospects are better in a world where the difficulty in overriding the signal is commensurate with the potential costs of doing so. If attempting to override such signals were not so viscerally unpleasant, we'd probably only be able to make remotely effective tradeoffs on them using System 2 reasoning, and we're very often not in a state to do that when making decisions regarding damage to our own bodies.

Lumifer (12y, 21 points):

Let me offer a similar scenario that has the advantage of reality: we can implement it right now without waiting for future research.

We know where the pleasure centers of rats are. We can implant electrodes into these centers and stimulate them, leaving the rats in a more or less perpetual state of ecstasy.

We can right now create Happy Rat Farms where rats' brains are electrically stimulated to experience lots and lots of great pleasure.

Is it valuable to create Happy Rat Farms?

ChristianKl (12y, 1 point):

Or alternatively:

Should we wirehead those rats we use for toxicity testing of new medications?

A1987dM (12y, 0 points):

We can right now create Happy Rat Farms where rats' brains are electrically stimulated to experience lots and lots of great pleasure. [emphasis in the original]

Sure we could, but how much would it cost us? Isn't there anything better we could do with the same amount of resources?

Mestroyer (12y, 8 points):

If Omega explained it was about to take out its super-scalpel and give me an incredibly precise lobotomy, which would take away some abilities of my mind, but not all, and there was nothing I could do to escape it, and afterwards Omega would poke the remnant of me with hot irons for a few days before killing me, but I could pay in advance to escape the hot irons, and the same offer was given to everyone, regardless of what Omega had predicted that they would choose...

If the lobotomy would take away my ability to appreciate complex forms of beauty, humor, and camaraderie, or my ability to form or comprehend English sentences, my ability to contribute to society, my ability to organize my experiences into narratives, my ability to write and be persuaded by arguments like this one, my sense of morality and inclination to act upon it, or my ability to design tools, I would still pay not to be poked with hot irons.

But if I was told that the lobotomy would take away my ability to suffer (And Omega said that by "suffer," it meant whatever familiar yet unidentified processes in my brain I previously attached that word to), I wouldn't care about the hot irons.

Pablo (12y, 6 points):

Is "can they feel pain" or "can they feel pleasure" really the right question, though? Let's say we research the biological correlates of pleasure until we understand how to make a compact and efficient network of neurons that constantly experiences maximum pleasure. Because we've thrown out nearly everything else a brain does, this has the potential for orders of magnitude more sentience per gram of neurons than anything currently existing. A group of altruists intend to create a "happy neuron farm" of these: are they awesome and inspiring or misguided and creating nothing of value?

I think this is a false dilemma. I don't find that scenario "awesome", but I do believe it would be creating something of value. The reason I believe this is that, when I experience intense pleasure, I can apprehend that these experiences ought to exist in the universe, by virtue of how they feel. Filling the universe (or a portion of it) with these experiences is therefore a great thing, regardless of how "awesome" or "inspiring" I happen to find it.

jefftk (12y, 3 points):

I've edited the post to just ask the simpler question of whether this is valuable.

Eneasz (12y, 5 points):

Only tangentially related, but this reminds me of a piece of flash fiction about vast farms of neuron-less meat, and the conundrum a humane society faces when some of those slabs of meat develop neurons. Great story, only 1000 words: Neither Face Nor Feeling.

beth (12y, 3 points):

Suffering is an emotional state triggered by desire. Desire is the attachment of value to imagined experiences.

So there's a minimal level of consciousness required to experience suffering, and a neuron farm probably doesn't meet it, which is why it's not morally significant. What sorts of organisms do meet it is another matter.
