Wiki Contributions


Visual Mental Imagery Training

My visualization ability improves the closer I am to sleep, becoming near-perfect during a lucid dream.

Falsifiable and non-Falsifiable Ideas

You can generally throw unfalsifiable beliefs into your utility function, but you might consider this intellectually dishonest.

As a quick analogy, a solipsist can still care about other people.

A Series of Increasingly Perverse and Destructive Games

I escape by writing a program that simulates 3^^3 copies of myself escaping and living happily ever after (generating myself by running Solomonoff Induction on a large amount of text I type directly into the source code).

I attempted the AI Box Experiment (and lost)

I'm guessing Eliezer would lose most of his advantages against a demographic like that.

I attempted the AI Box Experiment (and lost)

Oh god, remind me to never play the part of the gatekeeper… This is terrifying.

Some scary life extension dilemmas

The lifespan dilemma applies to any unbounded utility function combined with expected-value maximization; it does not require simple utilitarianism.
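A minimal sketch of that point (my own illustration, not from the post, with made-up numbers): if utility is unbounded, an expected-value maximizer accepts every step of a lifespan-dilemma-style gamble, even though the survival probability shrinks toward zero. Here each hypothetical step multiplies utility by a million at a 0.1% risk of losing everything.

```python
# Illustrative only: each step trades a 0.1% chance of death
# for a millionfold gain in (unbounded) utility.
prob, utility = 1.0, 1.0
for _ in range(50):
    new_prob, new_utility = prob * 0.999, utility * 1e6
    # Accepting always has higher expected utility than stopping,
    # so the expected-value maximizer takes every step...
    assert new_prob * new_utility > prob * utility
    prob, utility = new_prob, new_utility

# ...yet survival probability falls without bound as steps accumulate.
print(round(prob, 3))  # ~0.951 after 50 steps; -> 0 in the limit
```

No assumption about the shape of the utility function beyond unboundedness is doing any work here, which is the point of the comment.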

New censorship: against hypothetical violence against identifiable people

Would your post on eating babies count, or is it too nonspecific?

(I completely agree with the policy, I'm just curious)

Causal Universes

There are people who claim to be less confused about this than I am

Solipsists should be able to dissolve the whole thing easily.

Gap in understanding of Logical Pinpointing

Thanks, can you recommend a textbook for this stuff? I've mostly been learning from Wikipedia.

I can't find a textbook on logic in the LessWrong textbook list.
