Comments

If you have a meta-belief that none of your beliefs are certain, does that make all your beliefs celiefs?

That argument doesn't work well on its own terms: we have extinguished far fewer species than we have not.

So humans are "aligned" if humans have any kind of values? That's not how alignment is usually used.

“The Orthogonality Thesis asserts that there can exist arbitrarily intelligent agents pursuing any kind of goal.”

The Orthogonality Thesis is often used in a way that "smuggles in" the idea that an AI will necessarily have a stable goal, even though goals can be very varied. But similar reasoning shows that any combination of goal (in)stability and goallessness is possible as well: mindspace contains agents with fixed goals, agents with randomly drifting goals, corrigible agents (with externally controllable goals), as well as non-agentive minds with no goals.

We must always start with the simplest possible explanations for the phenomena that surround us.

Why?

The fewer components, abstractions, or entities required for a hypothesis, the better the hypothesis.

Why?

(Not doubting Occam's razor, pointing out that it needs an explanation).

There is more than one way to correctly describe reality.

That goes against the law of non-contradiction: if the two ways are different, they cannot both be correct.

Newton’s theory was nominally refuted by Einstein’s relativism, but this did not stop it from working.

"Working" means making correct predictions, not describing reality.

However, Stephen Hawking suggests instead that we consider them all true: that a theory accurately describes the fundamental nature of things is of less importance to us than that it gives us reliable mechanisms for interacting with reality.

How important something is depends on one's values.

“All models are wrong, but some of them are useful.”

...is the opposite of "There is more than one way to correctly describe reality", unless you start changing the meanings of "works"/"useful" versus "true"/"describes reality".

PS. Nothing to say about induction?

It's not two things, risk versus safety, it's three things: existential risk versus sub-existential risk versus no risk. Sub-existential risk is the most likely on priors.

That would be a philosophical problem...

Truth and Feelings can be reconciled, so long as you are not extreme about either: if your true beliefs are hurtful, you can keep them to yourself. Your worldview can be kept separate from your persona. The problem is when you bring a third thing -- the thing known as sincerity or tactlessness, depending on whether or not you believe in it -- into the picture. If you feel obliged to tell the truth, you are going to hurt feelings.

This used to be well known, but is becoming unknown because of an increasing tendency to use words like "truth" and "honesty" in a way that encompasses offering unsolicited opinions in addition to avoiding lying. If you can't make a verbal distinction, it's hard to make a conceptual one.

He visibly cared about other people being in touch with reality. “I’ve informed a number of male college students that they have large, clearly detectable body odors. In every single case so far, they say nobody has ever told them that before,” he wrote. (I can testify that this is true: while sharing a car ride with Anna Salamon in 2011, he told me I had B.O.)[21]

Well, that goes beyond having true beliefs and only making true statements.

"Read the sequences....just the sequences"

Something a better, future version of rationalism could do is build bridges and facilitate communication between these little bubbles. The answer-to-everything approach has been tried too many times.

Either the Tao can influence the world in the present, in which case the conditioners can never *really* prevent it from reasserting itself; or it can’t, in which case how did we first find it anyway; or it controlled the beginning as first cause, in which case whatever happens anywhere ever is what it intended; or it intended something different but it’s not very good at its job.

Or it influences the world in proportion to how much it is recognised, and how much you influence the world is proportional to how much you recognise it. The Tao that controls you is not the Tao: the Tao you control is not the Tao. The Tao that does everything is not the Tao; the Tao that does nothing is not the Tao.
