alexey's Comments

Leto among the Machines

I don't believe the original novels imply that humanity nearly went extinct and then banded together; that appeared only in "the junk Herbert's son wrote". Nor that strong AI was developed only a short time before the Jihad started.

Neither of these is true in the Dune Encyclopedia version, which Frank Herbert at least didn't strongly disapprove of.

There is still some Goodhart's-Law-ing there. To quote:

After Jehanne's death, she became a martyr, and her generals continued with even more zeal. Jehanne knew her weaknesses and fears, but her followers did not. The politics of Urania were favored. At first, the goal of the Jihad was the destruction of machine technology operating at the expense of human values; but by this point it had been replaced by indiscriminate slaughter.

Intrinsic properties and Eliezer's metaethics

Whereas I can look at a regular triangle and see its ∆-ness from outside the simulation, I cannot do the same (let's suppose) for keys of the right shape to open lock L.

Why suppose this and not the opposite? If you understand L well enough to see immediately whether a key opens it, does that make L-openingness intrinsic, so that intrinsicness/extrinsicness is relative to the observer?

And on the other hand, someone else might need to simulate a ruler to check for ∆-ness, so to him it is an extrinsic property.

Namely, goodness of a state of affairs is something that I can assess myself from outside a simulation of that state.

I would certainly consider this much more difficult than merely checking whether a key opens a lock. After spending enough time, I could understand the lock well enough for this; but could I do the same for a complete state of affairs, e.g. of the entire Earth?

2017 LessWrong Survey

I've taken the survey.

What conservatives and environmentalists agree on

Most leftists ... believe we can all agree on what crops to grow (what social values to have [2])

Whose slogan is "family values", again?

and pull out and burn the weeds of nostalgia, counter-revolution, and the bourgeoisie

Or the weeds of revolution, hippies, and trade unions...

Conservatives view their own society the way environmentalists view the environment: as a complex organism best not lightly tampered with. They're skeptical of the ability of new policies to do what they're supposed to do, especially a whole bunch of new policies all enacted at once.

A bunch of new policies like the War on Drugs, for example?

Lesswrong 2016 Survey

I've taken the survey.

The AI That Pretends To Be Human

Second AI: If I just destroy all humans, I can be very confident any answers I receive will be from AIs!

Astronomy, Astrobiology, & The Fermi Paradox I: Introductions, and Space & Time

The amount of line emission from a galaxy is thus a rough proxy for the rate of star formation – the greater the rate of star formation, the larger the number of large stars exciting interstellar gas into emission nebulae... Indeed, their preferred model to which they fit the trend converges towards a finite quantity of stars formed as you integrate total star formation into the future to infinity, with the total number of stars that will ever be born only being 5% larger than the number of stars that have been born at this time.

Is this a good proxy for total star formation, or only for large-star formation? Is it plausible that while no or few large stars are forming, many dwarfs are?

L-zombies! (L-zombies?)

But my point is that at some point, a "static analysis" becomes functionally equivalent to running it. If I do a "static analysis" to find out what the state of the Turing machine will be at each step, I will get exactly the same result (a sequence of states) that I would have gotten if I had run it for "real", and I will have to engage in computation that is, in some sense, equivalent to the computation that the program asks for.

The crucial words here are "at some point". And Benja's original comment (as I understand it) says precisely that Omega doesn't need to get to that point in order to find out with high confidence what Eliezer's reaction to the counterfactual mugging would be.

L-zombies! (L-zombies?)

Suppose I've seen records of some inputs and outputs of a program: 1->2, 5->10, 100->200. In every case I am aware of, it was given a number as input and it output the doubled number. I don't have the program's source or the ability to access the computer it's actually running on. I form a hypothesis: if this program received the input 10000, it would output 20000. Am I running the program?

In this case: doubling program<->Eliezer, inputs<->comments and threads he is answering, outputs<->his replies.
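The point can be sketched in code. This is a hypothetical illustration, not anyone's actual method: we predict the unseen program's output purely by pattern-matching on its recorded I/O, without ever executing the program itself (the `hypothesize` helper and the records are assumptions for the sketch):

```python
# Observed input/output records of an unseen program (from the comment above).
observed = {1: 2, 5: 10, 100: 200}

def hypothesize(records):
    """Form a hypothesis about the program from its I/O records alone.

    If every recorded output is double its input, conjecture that the
    program doubles its input. This inspects the records, never the
    program itself.
    """
    if all(out == 2 * inp for inp, out in records.items()):
        return lambda n: 2 * n  # hypothesized behavior, not the real program
    return None

predict = hypothesize(observed)
if predict is not None:
    print(predict(10000))  # hypothesis: the program would output 20000
```

Whatever computation `hypothesize` does, it is clearly far cheaper than, and different from, running the original program, which is the intuition behind the question above.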

L-zombies! (L-zombies?)

But I can still do static analysis of a Turing machine without running it. E.g., I can determine in finite time that a Turing machine would never terminate on a given input.
