spamham


Are wireheads happy?

Kevin means this I suppose?

I do know that it is possible for me to mechanically activate the motivation to perform these tasks (and I am on medication that is supposed to help, but I get the feeling it isn't working).

Circular Altruism

With all due respect, this post reminds me of why I find the expectation-calculation kind of rationality dangerous.

IMO examples such as the first, with known probabilities and a straightforward way to calculate utility, are a total red herring.

In more realistic examples, you'll have to make many judgment calls, such as the choice of model and your best estimates of the basic probabilities and utilities, which will ultimately be grounded at the fuzzy, biased intuitive level.

I think you might reply that this isn't a specific fault with your approach, and that everyone has to start with some axioms somewhere. Granted.

Now the problem, as I see it, is that picking these axioms (including quantitative estimates) once and for all, and then proceeding deductively, will amplify any initial errors. (Silly metaphor: a bit like travelling from one point to another by calculating the angle once and then walking in a straight line, instead of making corrections as you go. But, quitting the metaphor, I'm not just talking about divergence over time; the errors also compound along the chain of deduction.)

So now you have a conclusion which is still based on the fuzzy and intuitive, but which has an air of mathematical exactness... If the model is complex enough, you can probably reach any desired conclusion by inconspicuous parameter twiddling.
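To make the parameter-twiddling worry concrete, here is a minimal sketch (with made-up numbers chosen purely for illustration) of how a tiny nudge to one intuitively estimated probability can flip which option an expected-utility calculation endorses:

```python
def expected_utility(outcomes):
    """Expected utility of a list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Option A: a certain, moderate benefit.
option_a = [(1.0, 400)]

# Option B: a risky, larger benefit. The 0.45 is an intuitive estimate,
# not a measured frequency -- exactly the kind of fuzzy input at issue.
option_b = [(0.45, 1000), (0.55, -50)]

# With these inputs, B edges out A (roughly 422.5 vs 400).
print(expected_utility(option_a), expected_utility(option_b))

# Nudge the estimated probability by 0.05 -- well within the error bars
# of any gut-level estimate -- and the conclusion reverses (roughly 370).
option_b_tweaked = [(0.40, 1000), (0.60, -50)]
print(expected_utility(option_b_tweaked))
```

The point of the sketch is not that the arithmetic is wrong, but that the "mathematically exact" conclusion inherits the full uncertainty of its intuitive inputs, and a motivated modeller has plenty of room to steer it.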

My argument is far from "Omg it's so coldhearted to mix math and moral decisions!". I think math is an important tool in the analysis (incidentally, I'm a math student ;)), but you should know its limitations and hidden assumptions when applying it to the real world.

I would consider an act of (intuitively wrongful) violence based on a 500-page utility expectation calculation no better than one based on elaborate logic grounded in scripture or ideology.

I think that, after being informed by rationality about all the value-neutral facts, intuition, as fallible as it is, should be the final arbiter.

I think these sacred (no religion implied) values you mention, and especially kindness, do serve an important purpose, namely as a safeguard against the subtly flawed logic I've been talking about.

Are wireheads happy?

Sorry to hear about the drug problems, but how can you be sure they "destroyed" your dopamine neurons? Not all drugs that increase these neurons' activity kill them. Psychological changes might be a simpler explanation IMHO (but I don't know you, so that might be far off the mark).

[...] knock out the wanting to do drugs part of their brain...

Sounds draconian. That part isn't just there for drugs...

Are wireheads happy?

Seems like a pretty large leap from certain simple behaviours of rats to the natural-language meanings of "wanting" and "liking". Far-reaching claims such as this one should have strong evidence. Why not give humans drugs selective for either system and ask them? (Incidentally, at least with the dopamine system, this has been done millions of times ;) The opioids are a bit trickier, because activating mu receptors, e.g. by means of opiates, will in turn cause a dopamine surge too.)

(Yes, I should just read the paper for their rationale, but can't be bothered right now...)