
andrew sauer

Comments

Alcohol is so bad for society that you should probably stop drinking
andrew sauer · 4d · 10

Unintended pregnancies don't sound like a benefit to me.

The Bone-Chilling Evil of Factory Farming
andrew sauer · 12d · 10

Mark this, anyone who wishes to align AI by training it on human values. What might an intellectually superior AI conclude is the appropriate way to treat intellectually inferior beings?

If you want a future that isn't hell, reckoning conclusively with this issue, and all issues like it, is an absolute necessity.

If digital goods in virtual worlds increase GDP, do we actually become richer?
andrew sauer · 17d · 10

"dating apps where people could signal their wealth by buying the most expensive virtual good available."

This is a Molochian race to the bottom, similar to a dollar auction. The items have value to people, but only insofar as they have more of them than others. The people overall are therefore not better off for having these items, because the signalling game is zero-sum; there is only so much of the actual reward they are after.
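A toy numerical sketch of that zero-sum structure (all parameters here are hypothetical, not from the post): each user can buy a costly prestige item whose only payoff is relative rank. Buying looks good from any individual's seat, but once everyone buys, ranks are exactly where they started and the group is poorer by the total amount spent.

```python
# Toy model (hypothetical numbers) of a positional signalling game.
# Status is zero-sum: rank gained by a buyer is rank lost by non-buyers.

N = 10                # users
PRICE = 50            # cost of the prestige item
STATUS_PER_RANK = 10  # value of out-ranking one other user

def payoff(i, buyers):
    """Payoff for user i: net rank (zero-sum across users) minus purchase cost."""
    outranks = sum(1 for j in range(N) if j != i and i in buyers and j not in buyers)
    outranked_by = sum(1 for j in range(N) if j != i and j in buyers and i not in buyers)
    cost = PRICE if i in buyers else 0
    return STATUS_PER_RANK * (outranks - outranked_by) - cost

def total_welfare(buyers):
    return sum(payoff(i, buyers) for i in range(N))

print(total_welfare(set()))           # nobody buys: total welfare 0
print(payoff(0, {0}))                 # lone buyer: 10*9 - 50 = +40, so buying pays off...
print(total_welfare({0}))             # ...but the group total is already -50 (the money spent)
print(total_welfare(set(range(N))))   # everyone buys: ranks unchanged, total = -N*PRICE = -500
```

The status terms always sum to zero, so total welfare is just minus the total spending. That is the dollar-auction flavour of the dynamic: in this toy setup buying is individually rational at every step, and the endpoint leaves everyone worse off than if no one had played.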

Why Reality Has A Well-Known Math Bias
andrew sauer · 21d · 10

The problem with arguments like this is that they are typically circular. At the end of the day you are using math to try to show why math is necessary for reasoning or whatever.

Best to just take a few unjustified axioms, so that you're honest about the uncertainty at the bottom of any worldview.

Should we aim for flourishing over mere survival? The Better Futures series.
andrew sauer · 26d · 10

Does the above chart assume all survival situations are better than non-survival? Because that is a DANGEROUS assumption to make.

Blackmailers are privateers in the war on hypocrisy
andrew sauer · 3mo · 20

Maybe hypocrisy in the sense that someone acts like they agree with the social consensus in order to avoid persecution, when in fact they don't and are doing things which don't conform to it. Legalized blackmail would encourage people to not mind their own business and become morality police or witch hunters even about things which don't actually hurt them or anybody else.

Consider the effect legalized blackmail would have had on the gay community before widespread acceptance, for a particularly brutal and relatively recent example.

Human takeover might be worse than AI takeover
andrew sauer · 4mo · 43

Keep in mind also that humans often seem to just want to hurt each other, despite what they claim, and have more motivations and rationalizations for this than you can even count. Religious dogma, notions of "justice", spitefulness, envy, hatred of any number of different human traits, deterrence, revenge, sadism, curiosity, reinforcement of hierarchy, preservation of traditions, ritual, "suffering adds meaning to life", sexual desire, and more that I haven't even mentioned. Sometimes it seems half of human philosophy is just devoted to finding ever more rationalizations to cause suffering, or to avoid caring about the suffering of others.

AI would likely not have all this endless baggage causing it to be cruel. Causing human suffering is not an instrumentally convergent goal. So, most AIs will not have it as a persistent instrumental or terminal goal. Not unless some humans manage to "align" it. Most humans DO have causing or upholding some manner of suffering as a persistent instrumental or terminal goal.

Does this game have a name?
Answer by andrew sauer · Apr 12, 2025 · 10

This is equivalent to the game Westley played with Vizzini. You know, if Westley didn't cheat. I like to call it "Sicilian Chess" for that reason, though that's just me.

LWLW's Shortform
andrew sauer · 5mo · 93

Trump shot an arrow into the air; it fell to Earth, he knows not where...

Probably one of the best succinct summaries of every damn week that man is president lmao

Love is Love, Science is Fake
andrew sauer · 5mo · 10

LOL @ the AI-warped book in that guy's hands

Posts

13 · Curriculum of Ascension · 10mo · 0 comments
129 · The case against AI alignment · 3y · 110 comments
3 · A simulation basilisk · 4y · 1 comment
11 · Torture vs Specks: Sadist version [Question] · 4y · 12 comments