by ike
1 min read · 1st Sep 2019 · 11 comments


[-] ike · 3y · 50

The other day a piece fell off one side of my glasses (the part that touches the nose).

The glasses stay on, but I've noticed a weird feeling of imbalance at times. I could be imagining it; I'm able to function apparently normally. But the obvious analogy is to cinematography: directors consciously adjust camera angles and framing to induce particular emotions or reactions to a scene. It's plausible that even a very slight asymmetry in your vision can affect you.

If this is true, might there be other low-hanging fruit for adjusting your perception to increase focus?

[-] ike · 5y · 40

Does the anti-p-zombie argument imply you can't simulate humans past some level of fidelity without producing qualia/consciousness?

Or is there a coherent position whereby p-zombies are impossible but arbitrarily accurate simulations that aren't conscious are possible?

Yes, it implies that. The exact level of fidelity required is less straightforward: it's clear that a perfect simulation must have qualia/consciousness, but the argument doesn't survive small imperfections, so to determine whether an imperfect simulation is conscious we'd have to grapple with the even harder problems of neuroscience.

[-] ike · 5y · 10

How does it imply that?

I have intuitions on both sides. The intuition against is that you can predict the outcome of a process without having anything isomorphic to the individual steps of that process: it seems plausible (or at the very least possible and coherent) for humans to be predictable, even perfectly, by something that contains nothing isomorphic to a human. But a perfect predictor would count as an arbitrarily accurate simulation.

[-] TAG · 5y · 10

Which anti-p-zombie argument? You can argue that no one has qualia, that physical duplicates necessarily have non-physical qualia, that physical duplicates necessarily have physical qualia, etc.

[-] ike · 5y · 10

The argument that qualia can't be epiphenomenal.

The argument was something like: "if qualia exist, then they probably have observable effects; you without qualia would be different from you with qualia".

[-] ike · 5y · 10

But obviously a simulation of you differs in some respects from the real you. It's not obvious that the argument carries over.

1) What aspects?

2) You're assuming qualia exist?

[-] ike · 5y · 10

Causality is different, for one. The real you has a causal structure where future actions are caused by your present state plus some inputs. The simulated you has a causal structure where actions are caused, to some extent, by the simulator.

I'm not really assuming that. My question is whether there's a coherent position where humans are conscious, p-zombie humans are impossible, but simulations can be high-fidelity yet not conscious.

I'm not asking whether it's true, just whether the standard argument against p-zombies rules this out as well.

Well, if qualia aren't epiphenomenal, then an accurate simulation must include them or deviate into error. Claiming that you could accurately simulate a human but leave out consciousness is just the p-zombie argument in different robes.