All Posts

Sorted by Magic (New & Upvoted)

Monday, January 6th 2020

Personal Blogposts
Shortform [Beta]
George (19d): I recently found it fun to think about whether there are separate consciousnesses in a single brain. There's the famous example of a corpus callosotomy producing split-brain patients, where seemingly two rational but poorly-communicating entities exist within the brain. I think many people share the intuition that both entities in this case (both hemispheres) are conscious in some way. People also have the intuition that animals with brain processes far different from ours (rats, cats, cows, etc.) may experience/produce something like consciousness. Even more so, when coma patients wake up and tell stories of being conscious during the coma, just unable to act, we usually think that this too is a form of experience similar to what most of us call consciousness (if not exactly the same).

Yet there doesn't seem to be a commonly shared intuition that our own brain might harbor multiple conscious entities, despite there being nothing to indicate the contrary. Indeed, I would say that if our intuitions go something like:

1. A CNS larger than {x} => consciousness
2. Splitting a CNS of size 2*{x} into two tightly linked halves => 2 consciousnesses
3. Consciousness does not require a definable pattern to exist, or at least whatever pattern is required doesn't seem to be a consistent opinion between people

then I can see no reason why those intuitions couldn't be stretched to say that it is plausible, and possibly even intuitive, for there to be 2, 3, or n conscious experiences going on within a brain at the same time. Indeed, I would say it might even be more likely for my brain to have, say, 5 non-intersecting conscious experiences going on at the same time than for a rat, with a brain much less developed than mine, to have a single conscious experience. Granted, I think functional MRI data might have a few things to opine about the former being less likely than the latter, but there's still nothing fundamentally differ
FactorialCode (19d): I've been thinking about arxiv-sanity lately, and I think it would be cool to have a sort of "LW papers" where we share papers relevant to the topics discussed on this website. I know that's what link posts are supposed to be for, but I don't want to clutter up the main feed. Many of the papers I want to share are related to the topics we discuss, but I don't think they're worthy of their own posts. I might start linking papers in my short-form feed.
Pattern (19d): "The No True Scotsman fallacy" is often cited when people do things like defining X not as Y, but as "Y when Y works."* This is the (explicit) ideal (that people may admit to). While those asking "What is X?" are probably interested in "When does Y work?", if a group defines itself based on X (and refers to itself with the label 'X'), then since their goal is to achieve that ideal, they themselves would very much like to know, and are working on, "What is necessary to make Y work/happen?" Thus 'Y is not working' may be (seen as) a criticism of (the group) X, and spark some debate. (The ideal may be a motte and bailey, or fake.)

To make this more concrete, here is an example: "Rationality is about winning." (I'm still waiting for the "X is not about Y" article "Rationality is not about winning".) What other things are (or can be) defined in terms of 'when they work'?

*Or more specifically (see the Wikipedia article [https://en.wikipedia.org/wiki/No_true_Scotsman]): 'people who like X' think of 'the examples of X they like' when they hear 'X'.