Anonymous48

An Especially Elegant Evpsych Experiment

At first I thought that it'd be the expected number of kids one would have over the rest of their life, but I don't see how that could ever go up.

An adult can start having kids right now, whereas an infant has to survive to adulthood first.

(Moral) Truth in Fiction?

Not enough sex and explosions in 3WC? Are you joking?

Oh, and it would be easier to find someone to make it into a good visual novel than into a good movie.

Normal Ending: Last Tears (6/8)

It's rather funny to see this ending described as awful by Eliezer, who at the same time endorses things such as: "In my head I have an image of the parliament of volitional shadows of the human species, negotiating a la Nick Bostrom. The male shadows and the female shadows are pretty much agreed that (real) men need to be able to better read female minds; but since this is a satisfaction of a relatively more "female" desire - making men more what women wish they were - the male shadows ask in return that the sex-drive mismatch be handled more by increasing the female sex drive, and less by decreasing male desire..."

So, intraspecies convergence of values is somehow ok, but interspecies isn't?

Interpersonal Entanglement

In my head I have an image of the parliament of volitional shadows of the human species, negotiating a la Nick Bostrom. The male shadows and the female shadows are pretty much agreed that (real) men need to be able to better read female minds; but since this is a satisfaction of a relatively more "female" desire - making men more what women wish they were - the male shadows ask in return that the sex-drive mismatch be handled more by increasing the female sex drive, and less by decreasing male desire...

What happens to those who absolutely refuse to accept a "few psychological nudges" done to themselves? They obtain the benefit of species-wide correction yet either don't contribute to the satisfaction of the other sex or are forced into it.

Sorry, my overemphasized antiauthoritarian emotional module had to bring that up.

Building Weirdtopia

Economic: everyone owns a nanoassembler but mass production and copy operations are prohibited. Sexual: age of consent is raised to 600 years.

Eutopia is Scary

restoration of travel time / communication delay

Communication delay as in IP over Avian Carriers? Next thing you know we'll be living in our EEA and have to walk 10 km uphill to school, both ways.

but the question I have to ask is "Will this drive more than 5% of my reading audience insane?"

So you're fine with that amount of collateral damage? :) Seriously, make a "mental stability" questionnaire and distribute your story under NDA to those who pass. Consider me interested too.

Growing Up is Hard

Eliezer: Sometimes changing yourself the wrong way, and being murdered or suspended to disk, and replaced by an earlier backup.

Uh, no. If restoration from backup happens shortly after the wrong change, I'd think of it as a day you wasted and don't remember, definitely not a murder. Something only weakly to be avoided.

Unknown: This actually happens to us all the time, without any modification at all, and we don't care at all, and in fact we tend to be happy about it...

Most don't care. I, for one, am kind of worried about my personal goal drift and what my (natural) future brain neurochemistry changes could do to my current values.

Nonperson Predicates

I'd like to second what Julian Morrison wrote. Take a human and start disassembling it atom by atom. Do you really expect to construct some meaningful binary predicate that flips from 1 to 0 somewhere along the route?

EY: What if an AI creates millions, billions, trillions of alternative hypotheses, models that are actually people, who die when they are disproven?

If your AI is fully deterministic, then any of its states can be recreated exactly. Just set the log level of the baby AI's inputs to 'everything' and hope your supply of write-once-read-many media doesn't run out before it gets smart enough to discard, in a provably Friendly way, the data that isn't people. That doesn't solve the problem of suffering, though.

Suppose an AI creates a sandbox and runs a simulated human with a life worth living inside it for 50 subjective years (interactions with other people are recorded at their natural borders, and we don't consider merging minds). Then the AI destroys the sandbox, recreates it, and bit-perfectly reruns the simulation. Setting aside the meaningless waste of computing resources, does your morality say this is better, equivalent, makes no difference, or worse compared to restoring a copy from backup?
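The premise above, that a fully deterministic process whose inputs are all logged can be recreated bit-perfectly, can be illustrated with a toy sketch (hypothetical model of my own; `run_agent` and its state-update rule are made up for illustration, not anyone's actual proposal):

```python
import random

def run_agent(seed, inputs):
    """Deterministic toy 'agent': its state depends only on the seed
    and the logged input stream, so a rerun reproduces it exactly."""
    rng = random.Random(seed)  # seeded PRNG: same seed, same stream
    state = 0
    for x in inputs:
        state = (state * 31 + x + rng.randrange(1000)) % (2**64)
    return state

# Log every input once (write-once-read-many style), then any past
# state can be recreated exactly by rerunning from the same seed.
log = [3, 1, 4, 1, 5, 9, 2, 6]
first = run_agent(42, log)
replay = run_agent(42, log)
assert first == replay  # determinism makes the rerun identical
```

The point of the sketch is only that "destroy and bit-perfectly rerun" is well-defined for a deterministic sandbox; whether the rerun has any moral weight is the question being asked.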

High Challenge

On the subject of MMORPGs: I enjoyed playing one for about a year, and then it stopped being fun. In the beginning the interesting part was exploring the world, learning new things, the rules and interactions between various parts of the system, the feeling of steady advance towards some clearly defined goal, and work that was guaranteed to pay off. After a while the grinding started to become annoying, and my interest shifted towards minmaxing everything and writing complex scripts to let bots do the boring parts.

Then the realisation hit me. In the real world, an extraordinarily efficient way of doing things is good: it's called an invention. In a game, it is called cheating. Nature doesn't care what smart tricks you used to achieve your goals; in a game, if it wasn't anticipated by the developers, it probably counts as an exploit. The universe has a set of unchanging rules; a game is perpetually balanced by a series of patches and crutches in unpredictable places. By being creative you are fighting against the game developers, which is pointless, because they will actively oppose you and get rid of you through their control over the sandbox. You aren't expected to have the hacker (in the original meaning of the word) kind of fun in such games.