Stupid Questions September 2017

by Erfeyah · 1 min read · 15th Sep 2017 · 27 comments


Personal Blog

This thread is for asking any questions that might seem obvious, tangential, silly or what-have-you. Don't be shy, everyone has holes in their knowledge, though the fewer and the smaller we can make them, the better.

Please be respectful of other people's admitting ignorance and don't mock them for it, as they're doing a noble thing.

To any future monthly posters of SQ threads, please remember to add the "stupid_questions" tag.


I was wondering if someone can point me to a good LW article (or articles) refuting Searle's Chinese Room argument and addressing consciousness in general. A search turns up a lot of articles mentioning it, but I assume it is addressed in some form in the Sequences?

I don't remember if the Sequences cover it. But if you haven't already, you might check out SEP's section on Replies to the Chinese Room Argument.

That is great! Thanks :)

What about it seems worth refuting?

The Zombie sequence may be related. (We'll see if I can actually link it here.) As far as the Chinese Room goes:

  • I think a necessary condition for consciousness is approximating a Bayesian update. So in the (ridiculous) version where the rules for speaking Chinese have no ability to learn, they also can't be conscious.
  • Searle talks about "understanding" Chinese. Now, the way I would interpret this word depends on context - that's how language works - but normally I'd incline towards a Bayesian interpretation of "understanding" as well. So this again might depend on something Searle left out of his scenario, though the question might not have a fixed meaning.
  • Some versions of the "Chinese Gym" have many people working together to implement the algorithm. Now, your neurons are all technically alive in one sense. I genuinely feel unsure how much consciousness a single neuron can have. If I decide to claim it's comparable to a man blindly following rules in a room, I don't think Searle could refute this. (I also don't think it makes sense to say one neuron alone can understand Chinese; neurologists, feel free to correct me.) So what is his argument supposed to be?
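The "approximating a Bayesian update" condition in the first bullet can be made concrete. Here is a minimal sketch of a Bayesian update; the coin example, hypotheses, and numbers are purely illustrative, not from the thread:

```python
# Minimal Bayesian update: posterior = prior * likelihood / evidence.
# Two hypotheses about a coin: fair (P(heads)=0.5) vs. biased (P(heads)=0.8).

def bayes_update(prior_fair, p_heads_fair, p_heads_biased, observed_heads):
    """Return P(fair | one observed flip)."""
    p_fair = p_heads_fair if observed_heads else 1 - p_heads_fair
    p_biased = p_heads_biased if observed_heads else 1 - p_heads_biased
    evidence = prior_fair * p_fair + (1 - prior_fair) * p_biased
    return prior_fair * p_fair / evidence

belief = 0.5  # start undecided between the two hypotheses
for flip in [True, True, True]:  # observe three heads in a row
    belief = bayes_update(belief, 0.5, 0.8, flip)
print(round(belief, 3))  # -> 0.196: belief in "fair" drops with the evidence
```

A system whose "rules" never move its beliefs in response to evidence like this, the argument in the bullet goes, fails the proposed necessary condition.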

Thanks for the pointer to the Zombie sequence. I've read part of it in the past and didn't think it addressed the issue, but I will revisit.

What about it seems worth refuting?

Well, the way it shows that you cannot get consciousness from syntactic symbol manipulation. A Bayesian update is also a type of syntactic symbol manipulation, so I am not clear why you are treating it differently. Are you sure you are not assuming that consciousness arises algorithmically in order to justify your conclusion, thus introducing circularity into your logic?

I don't know. Many people reject the 'Chinese Room' argument as naive, but I haven't understood why yet, so I am honestly open to the possibility that I am missing something.

I repeat: show that none of your neurons have consciousness separate from your own.

Why on Earth would you think Searle's argument shows anything, when you can't establish that you aren't a Chinese Gym? In order to even cast doubt on the idea that neurons are people, don't you need to rely on functionalism or a similar premise?

(I am not sure at all about all this so please correct me if you recognise any inconsistencies)

First of all, I honestly don't understand your claim that neurons have consciousness separate from our own. I don't know, but I certainly don't have any indication of that...

Why on Earth would you think Searle's argument shows anything, when you can't establish that you aren't a Chinese Gym?

The point is that the brain is not a Turing machine, since it does not seem to be digital. A Chinese Gym would still be a syntactic system that uses 'instructions' between people. This is related to the way Giulio Tononi is attempting to solve the problem of consciousness with his Phi theory.

I don't quite get the difference between "any" and "every" (in the more interesting cases). Does "every 2 out of 30 [things have this property]" mean the set of ordered pairs as a whole (unlike "any 2 out of 30", which is talking about any one combination of two things, but not all possible combinations taken at once)?

And if "every" needs some kind of order, even if we don't know which, and some kind of "presented-togetherness", then we can, for example, say "[every two out of four] out of [every thirty out of thirty]", but I don't quite understand what that would mean...

I mean, it doesn't have to be something trivial like "every apple out of thirty apples has a spot on its side", it can be something like "every node out of thirty is connected to another node". But even this second case does not quite fit. Are there even objects for which "every 2 out of 30" and "every 1 out of 30" are two distinct things?

(and negation is even worse.)

"Every" doesn't need an order.

"For every x, property(x) holds" means "it is not the case that for any x, property(x) does not hold."

"For any x, property(x) holds" means "it is not the case that for every x, property(x) does not hold."

In Russian, quantifier adjectives are often implicit, which could be a part of the problem here. Native Russian speakers (like me) often have problems with this, also with definite vs indefinite articles in English.

edit: not only implicit but ambiguous when explicit, too!

Person below is right, "every" is sort of like an infinite "AND" and "any" is sort of like an infinite "OR."
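The AND/OR analogy can be checked directly in code; Python's built-ins are even named after the two quantifiers. A small illustrative sketch (the example list is made up):

```python
# "every x has P" behaves like a chain of ANDs; "any x has P" like a chain of ORs.
xs = [2, 4, 6, 7]

every_even = all(x % 2 == 0 for x in xs)  # True AND True AND True AND False
any_even = any(x % 2 == 0 for x in xs)    # True OR True OR True OR False

print(every_even, any_even)  # -> False True

# The duality stated above: "for every x, P(x)" == "not (for any x, not P(x))".
assert all(x % 2 == 0 for x in xs) == (not any(x % 2 != 0 for x in xs))
```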

(Still confused.) Then it is possible to say, in principle, "for every combination of n out of the whole set of n, property(x) holds" and mean ordered combinations? Is there any other meaning for "every 30 out of 30"?

(yes, it is probably because of my language background. I don't even use the Russian analogues all that often!)

It is possible to say that, but the work is being done by "combination." You can also say "for every permutation of n" and that means something different.

Typically when you say "for every x out of 30, property(x) holds" it means something like:

"every poster on lesswrong is a human being" (or more formally, "for every poster on lesswrong, that poster is a human being"). (Note: this statement is meaningful but probably evaluates to false.)

Quantification is always over a set. If you are talking about permutations, you are first making a set of all permutations of 30 things (of which there are 30 factorial), and then saying "for every permutation in this set of permutations, some property holds."
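The point that the work is being done by "combination" vs. "permutation" — you first construct the set, then quantify over it — can be sketched with the standard library (a small stand-in set of 4 items instead of 30, for illustration):

```python
from itertools import combinations, permutations

items = range(4)  # a small stand-in for "30 things"

# "every 2 out of 4": first build the set of pairs, then quantify over it.
pairs = list(combinations(items, 2))  # unordered pairs: C(4, 2) = 6
perms = list(permutations(items, 2))  # ordered pairs: P(4, 2) = 12

print(len(pairs), len(perms))  # -> 6 12

# "for every pair (a, b), property holds" quantifies over the constructed set:
assert all(a != b for a, b in pairs)
```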

edit: realized your native language might be Ukrainian: I think a similar issue exists in Ukrainian quantifier adjectives.

And then, Ukrainian too has всяк/усякий (Russian: всякий), which is different from кожен (Russian: каждый)... If I were to translate усякий into English distinctly from both "every" and "any", I would probably have to say "of all kinds", but how do you say that about one thing?! Anyway, this is silly.

(And my "source" language is Russian + Tatar + Ukrainian. I don't even remember how Tatar handles this.)

Crimean Tatar?

I'm an Odessite, born in Crimea.

I'm a Russian from Kyiv; I did my first four school years in Kazan. Tatar, they say, is very different from Crimean Tatar.

It is very annoying that любой is translated both as "any" and "every."

какой-либо is closer to the formal logical "there exists", or "any."

It is also very annoying that I know damn right what I mean by любой, and so does любой with whom I speak.

Sometimes, it seems to me that English is just too precise. Or maybe it's just me.

In Ukrainian, we have жодний, which means "none of the above" or something like it... now that's a word worth having!

(This may not help.) It is the difference between "each of..." and "all of...".

"if we go through each of the set, one at a time..." "if we go through all of the set"

They can be made to mean the same.

what does "any" mean, then?

(yes, I agree that of course it usually means the same in practice, that's why this is a stupid question:) I just... I guess I see "any" as a potentiality, and "every" as realisation... anyway, do you think we can talk about these structures in some more complex way than simply "any one thing out of the collection" and "every one thing..."? What would it mean? I imagine "[every 2 out of 4] out of [every 30 out of 30]" as something like walked paths.

Edit to add: I just want to know what piece of math it corresponds to, mostly. (Combinatorics, obviously.) And what can be done next with this thing to make it more complicated and still have it make sense, kind of. Like combining "anies" with "everies" in different ways, and, if I were to go crazy all the way, dividing things?

A quick thought: it seems like 'any' is related to the logic function OR, and 'every' is related to the logic function AND. But likely I'm not totally grokking your question.

Does this thread elucidate anything?

I find it funny people think questions about the Chinese Room argument or induction are obvious, tangential, or silly. xD

Anyway: What is the best algorithm for deciding between careers?

(Rule 1: Please don't say the words "consult 80,000 Hours" or "Use the 80K decision tool!" That is analogous to telling an atypically depressed person to "read a book on exercise and then go out and do things!" Like, that's really not a helpful thing, since those people are completely booked (they didn't respond to my application despite my fitting their checkboxes). Also I've been to two of their intro workshops.)

I want to know what object-level tools, procedures, heuristics etc. people here recommend for deciding between careers. Especially if one feels conflicted between different choices. Thanks! :)

Write down your assumptions about the alternative career paths that you see.

Talk to other people. If you have multiple options, talk with your friends about them and get their feedback. If your friends don't include people in those careers, reach out to people inside those careers. Cold-approaching people over LinkedIn is one way to get in contact, and there are likely also networking events.

I don't know what the best algorithm is, but what I did was something like the following.

Step 1. Make a list of the things you enjoy doing. Attempt to be specific where possible - you want to get at the activity that's actually enjoyable, so "making up stories" is more accurate for me than "writing" is, since it's the storytelling part that's fun for me rather than the sitting down and typing specifically. Sort the list in the order that you most enjoy doing the thing, with an eye towards things you can do a lot of. (I do like chocolate, but there's a sharp limit on the amount of chocolate I can eat before it stops being fun.) There's no exact length you need, but 10-15 items seems to be the sweet spot.

Step 2. Line up the things you enjoy doing with jobs that do them a lot. Make a list of those jobs, putting under each job the different things you would like about them along with things you know you'd dislike about doing the job. Talking to people in that field, reading interviews with them, and good old fashioned googling are good steps here. Sort the jobs by how many of your favourite things to do are in them and how few things you don't want to do are in them.

Step 3. Take the list of jobs, and look up how much money each job makes, along with how much demand there is for that job and how many qualifications you'd need to earn to reasonably expect to get the job. Hours worked per week and health risks are also good things to think about. (Note: Sitting at a computer for nine hours straight should really count as a health risk. I'm not joking.)

Step 4. You now have a good notion of enjoyment vs practicality. If there's a standout winner in both of them, do that. If not, then consider your tradeoffs carefully. You will probably enjoy things less when you have to wake up every morning and do them, but it also caught me by surprise how little time it feels like I have to work on personal projects after eight or nine hours plus commuting.

Step 5. Think about UBI and cry a little, then dedicate a side project towards ushering in the glorious post-scarcity future.
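Steps 1-4 above amount to a weighted scoring procedure. A toy sketch of that idea - all job names, criteria, scores, and weights here are made up for illustration, and you would substitute your own:

```python
# Toy "enjoyment vs. practicality" scorer for the steps above.
# Scores are on a 0-10 scale; weights reflect how much each axis matters to you.

jobs = {
    "writer":     {"enjoyment": 9, "pay": 3, "demand": 4},
    "programmer": {"enjoyment": 6, "pay": 8, "demand": 9},
}
weights = {"enjoyment": 0.5, "pay": 0.3, "demand": 0.2}

def score(job):
    """Weighted sum of a job's scores across all criteria."""
    return sum(weights[k] * jobs[job][k] for k in weights)

ranked = sorted(jobs, key=score, reverse=True)
print(ranked[0], round(score(ranked[0]), 2))  # -> programmer 7.2
```

The useful part is not the arithmetic but being forced to write the criteria and weights down explicitly, as Step 4's "consider your tradeoffs carefully" suggests.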

There is a reason people say 80k, and it's because they have already done the research.

If not 80k: read Deep Work, So Good They Can't Ignore You, and maybe other books that suggest a "strategy" for employment. (Short version: get a job in an area on purpose. I.e., if you are a vampire, a job in a factory making garlic-free whole foods.)

Ask people around you, maybe 10 of them, why they chose their career and whether they like it. Ignore their answers and double-check by observing them work.

Figure out what satisfies the three criteria:

  • You like doing this
  • You are good at doing this
  • Other people value this (aka will pay you money for doing this)

What is the probability that induction works?

By Solomonoff induction, the hypothesis that governs the universe under the assumption that induction works has a smaller complexity penalty than one that counts to a number on the order of 10^80 to 10^18000 steps while the universe is running and then starts working differently. The penalty factor is about 10^17, since that's roughly how many Turing machines with 6 states there are, and 6 states is the number you need to count to that sort of number of steps. So the probability that induction works can be given an upper bound of about 1 - 10^-17.
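The "10^17" figure can be reproduced under one common convention for counting Turing machines, which I am assuming here since the comment doesn't specify one: with n states and a binary alphabet, each of the 2n (state, symbol) table entries has 2 × 2 × (n + 1) = 4n + 4 possible transitions (write bit, move direction, next state or halt), giving (4n + 4)^(2n) machines:

```python
# Count 6-state, 2-symbol Turing machines under the (4n+4)^(2n) convention:
# per (state, symbol) entry: write 0/1 x move L/R x (n states or halt)
# = 2 * 2 * (n + 1) = 4n + 4 choices, and there are 2n entries in the table.
n = 6
machines = (4 * n + 4) ** (2 * n)
print(f"{machines:.2e}")  # on the order of 10^17, matching the comment
```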

Shouldn't a particular method of inductive reasoning be specified in order to give the question substance?