All of hrishimittal's Comments + Replies

What expert timing, Luke! Just two days ago, I came across the fascinating practice of clicker training for horses while reading Kathy Sierra's old blog.

My only problem is that I need to train my own behaviour rather than someone else's. I'm going to try to use these techniques on myself, although I'm not sure if that's supposed to work.

"You have to make a conscious effort to keep your ideas about what you want from being contaminated by what seems possible. This is isomorphic to the principle that you should prevent your beliefs about how things are from being contaminated by how you wish they were. Most people let them mix pretty promiscuously. The continuing popularity of religion is the most visible index of that."

-- pg

Regarding 2, I think the default setting (Popular) is to display comments as a function of karma and time since posting. As comments get old, newer comments float to the top even if the older ones have some positive karma. If some comment has very high karma, I guess it outweighs the time constraint and stays at the top.

... and the ageing function is tuned for Reddit traffic volumes, so on this site, everything ages too fast and can't stay in popular for very long at all. Open source contributions [] to fix this are welcome.
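The ageing behaviour described above can be sketched with a toy scoring function. This is illustrative only: the function name, the log/power form, and the `gravity` constant are assumptions for the sketch, not the site's actual formula. The point is that a single decay constant tuned for Reddit-scale traffic makes everything on a low-traffic site age out of "Popular" too fast.

```python
import math

def hot_score(karma: int, age_hours: float, gravity: float = 1.8) -> float:
    """Toy time-decayed "Popular" ranking (not the real LW/Reddit formula).

    Higher karma fights a power-law age penalty. `gravity` controls how
    fast comments age out; lowering it for a low-traffic site would keep
    older high-karma comments visible longer.
    """
    return math.log10(max(karma, 1)) / (age_hours + 2) ** gravity

# At equal age, karma wins the ranking:
assert hot_score(200, 1) > hot_score(3, 1)
# At equal karma, the newer comment floats to the top:
assert hot_score(3, 1) > hot_score(3, 24)
# With gentler gravity, a very-high-karma old comment can stay on top:
assert hot_score(200, 24, gravity=0.5) > hot_score(3, 1, gravity=0.5)
```

Tuning for lower traffic would then be a matter of lowering `gravity` (or whatever decay constant the real implementation uses) so that a day-old comment isn't buried.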

Ok I don't mind. Richard, your call?

I'd be up for any of the next three Sundays, but this Sunday might be too short notice for other people. So far there are three people posting here. Anyone else? Make it a top-level comment so the nesting doesn't drive everything off the main page.

I will come. By usual venue, do you mean 5th View cafe on top of Waterstones bookstore near Piccadilly Circus?

Is there something specific we are going to discuss or is it pretty casual?

I would prefer late morning (say after 11).

I have preference for later hours, as in the mornings people tend to be rather sleepy, but I'll go with whatever majority wants.

Thanks Yvain, you have inspired me to commit to some important things for the next month. I have written them down.

I promise to write about my achievements here on LW on the 18th July.


When giving a security clearance, for example, you would rather give it to someone who loved his country emotionally, than to someone who loved his country rationally;

Can you clarify how you distinguish between loving one's country emotionally as opposed to rationally?

It reminds me very much of this quote attributed to Gautam Buddha:

"Believe nothing merely because you have been told it. Do not believe what your teacher tells you merely out of respect for the teacher. But whatsoever, after due examination and analysis, you find to be kind, conducive to the good, the benefit, the welfare of all beings -- that doctrine believe and cling to, and take it as your guide."

It's interesting speculation but it assumes that people use all of their current intelligence. There is still the problem of akrasia - a lot of people are perfectly capable of becoming 'smarter' if only they cared to think about things at all. Sure, they could still go mad infallibly but it would be better than not even trying.

Are you implying that more IQ may help in overcoming akrasia?

All other things being equal, increasing IQ will make people better at telling the difference between rational argument and sophistry, and at understanding marginally more complex arguments. Decreasing akrasia for the general population is a different issue; the first thought that comes to mind is that increasing people's IQ with fixed motivation ought to improve things.

Yes that's how I meant it.

The True Trolley Dilemma would be where the child is Eliezer Yudkowsky.

Then what would you do?

EDIT: Sorry if that sounds trollish, but I meant it as a serious question.

Shutting up and multiplying, the answer is clearly to save Eliezer... and to do so versus a lot more people than just three. The question is more interesting if you ask people what n (probably greater than 3) is their cut-off point.
The Yudkowsky worship is getting pretty thick around here: [] Let's not turn this into a fandom []
Perhaps you should clarify what angle you're trying to get at with this question. I expect you're raising some version of the "do you value some lives more than others" issue. There are likely at least some people here who would pick Yudkowsky over three unknown people, based on a rational evaluation of expected utility of continued existence. The same issue could be presented by replacing the child with any other person who is expected to have a large positive contribution to the world, such as a promising young surgeon who could potentially save many more than three lives over the course of his career. Or did you have something else in mind?

The bored teenager who finally puts together an AGI in his parents' basement will not have read any of these deep philosophical tracts.

That truly would be a sad day.

Are you seriously suggesting hypothetical AGIs built by bored teenagers in basements are "things which are actually useful in the creation of our successors"?

Is that your plan against intelligence stagnation?

I'll bet on the bored teenager over a sclerotic NASA-like bureaucracy any day. Especially if a computer is all that's required to play.

You make a lot of big claims in this thread. I'm interested in reading your detailed thoughts on these. Could you please point to some writings?

The intro section of my site (Part 1 [], Part 2 []) outlines some of my thoughts regarding Engelbartian intelligence amplification. For what I regard as persuasive arguments in favor of the imminence of petrocollapse, I recommend Dmitry Orlov's blog [] and dead-tree book.

As for my thoughts regarding AGI/FAI, I have not spoken publicly on the issue until yesterday, so there is little to read. My current view is that Friendly AI enthusiasts are doing the equivalent of inventing the circuit breaker before discovering electricity. Yudkowsky stresses the importance of "not letting go of the steering wheel" lest humanity veer off into the maw of a paperclip optimizer or similar calamity. My position is that Friendly AI enthusiasts have invented the steering wheel and are playing with it - "vroom, vroom" - without having invented the car. The history of technology provides no examples of a safety system being developed entirely prior to the deployment of "unsafe" versions of the technology it was designed to work with. The entire idea seems arrogant and somewhat absurd to me.

I have been reading Yudkowsky since he first appeared on the Net in the 90's, and remain especially intrigued by his pre-2001 writings - the ones he has disavowed, which detail his theories regarding how one might actually construct an AGI. It saddens me that he is now a proponent of institutionalized caution regarding AI. I believe that the man's formidable talents are now going to waste. Caution and moderation lead us straight down the road of 15th century China. They give us OSHA and the modern-day FDA. We are currently aboard a rocket carrying us to pitiful oblivion rather than a glorious SF future. [] I, for one, want off.

stagnation is as real and immediate a threat as ever there was, vastly dwarfing any hypothetical existential risks from rogue AI.

How is blindly looking for AGI in a vast search space better than stagnation?

How does working on FAI qualify as "stagnation"?

No amount of aimless blundering beats deliberate caution and moderation (see 15th century China example) for maintaining technological stagnation. It is a distraction from doing things which are actually useful in the creation of our successors. You are trying to invent the circuit breaker before discovering electricity; the airbag before the horseless carriage. I firmly believe that all of the effort currently put into "Friendly AI" is wasted. The bored teenager who finally puts together an AGI in his parents' basement will not have read any of these deep philosophical tracts.

I am convinced that resource depletion is likely to lead to social collapse - possibly within our lifetimes.

What convinced you and how convinced are you?

Dmitry Orlov [], and very.
Thanks, I think it shall be one of the best End Of The World Shows we've ever done. And this is at least number twelve :) For anyone who's never done it, I thoroughly recommend climbing on a soap-box in silly clothes and ranting a load of nonsense at speakers corner like you're an insane nutcase. Liberating for it to be so obvious for a change.

Surely the only point you're making in this long post is not that naïve consequentialism is a bad idea?

consider brainstorming for other goals that you might have ignored, and then attach priorities.

And how exactly does one attach priorities?

I am pretty bad in terms of writing long posts. I have shortened this one considerably given your comment. Let me know if you still think it is too long.
For this to apply in the real world, the players not only have to be rational, they also have to have common knowledge of each others' rationality. E.g. even if you're rational, if you think I'm stupid and will guess 5, then you should no longer guess zero. Even if I am rational, and everyone else has common knowledge of everyone else's rationality, if they know that you think I'm irrational, then they know that you'll guess higher than zero, so they'll all guess higher than zero, and so on...

In general, the more "stupid" people there are, or the more "stupid" people we think there are, or the more "stupid" people we think others think there are, or... the further the average guess is likely to be from 0. So (I assume) the point is to test the assumption of common knowledge of rationality: i.e. how stupid people are, how stupid we think other people are, how stupid we think other people think other people are, etc.

I don't understand how the average guess will be 0. Can you please explain?

You will pick 100. I know that, so I'll pick 66. You know that I know that, so you'll pick 44 instead. But I know that you know that I know that, so I'll pick 29 instead. But you know that I know that you know that I know that, so you'll pick 20 instead. But I know- This continues to infinity until both of our guesses approach 0.
It's the only Nash equilibrium []. The only way everyone can win (and thus, the only way no-one would want to change their guess if they knew all the other guesses) is for all of us to guess a number that is 2/3 of itself: i.e. 0. ETA: CannibalSmith's explanation is better. ETA2: AllanCrossman's is even better.
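The chain of "I know that you know..." reasoning above can be sketched as repeated best responses, each replacing the guess with 2/3 of the previous one (a toy model; the function name and defaults are illustrative):

```python
def iterated_best_response(start: float = 100.0, factor: float = 2 / 3,
                           rounds: int = 50) -> float:
    """Each round of reasoning best-responds by guessing `factor` times
    the guess expected from everyone else; the sequence converges to the
    unique Nash equilibrium at 0."""
    guess = start
    for _ in range(rounds):
        guess *= factor
    return guess

# The sequence runs 100 -> 66.7 -> 44.4 -> 29.6 -> 19.8 -> ...
# and is effectively 0 after enough rounds of mutual reasoning.
```

One round gives the naive "I'll pick 66" response; fifty rounds give a number indistinguishable from the equilibrium guess of 0.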

I've never been actively part of an online community before, so I'm a bit scared to come along. I do find this group interesting though, so I might come to the next meetup.

I don't mind the place as long as it's quiet, but prefer the format to be casual. Except for Tuesday, any day of the week is fine by me.

and when possible, use irrationality for the short run.

How exactly do you use irrationality?

You don't, you use a decision model that incorporates bias.

I'm considering donating to World Vision UK. Does anyone know much about them?

More generally, is there an easy way to find out how good a charity is? Are there reviews done by third parties?

I recommend you check out GiveWell [], they're doing this sort of thing.

I'm in a situation which seems sort of the opposite of yours. I'm with a woman, who's more rational than any other I personally know. But the sex is just not very good, and I find myself getting physically drawn to other women a bit too much. I've struggled for weeks, trying to decide whether to continue or not. I've tried hard to think what I really want. And I think that if I were sexually satisfied, I would be very happy with the relationship because everything else seems perfect. So, I'm trying to work on that now. I'm paying more attention to being a ... (read more)

My advice is first, to talk to her a lot about sex and make it clear how important that is to you. If that doesn't work, consider asking her for permission to sleep with other women. That option would satisfy me in your situation temporarily, but I'd have to think about whether it would satisfy me longer term.

Thanks. That looks like a really interesting body of work. This one on ethics is quite a fun read.

"Plod forever, but never believe you are going to get there."

-Sir Ranulph Fiennes

EDIT: I found this quote funny and strangely motivational, if you read it within the context. But looks like some people really dislike it.

If the master sat there listening to people's inane theories about how they need to punch differently than everybody else, or their insistence that they really need to understand a complete theory of combat, complete with statistical validation against a control group, before they can even raise a single fist in practice, that master would have failed their students AND their Art.

Even so, as a student, I do want the master to understand a complete theory of combat, complete with statistical validation against a control group.

What is your theory, O Master?

Understanding something doesn't necessarily mean you can explain it. And explaining something doesn't necessarily mean anyone can understand it. Can you explain how to ride a bicycle? Can you learn to ride a bicycle using only an explanation? The theory of bicycle riding is not the practice of how to ride a bicycle. Someone else's understanding is not a substitute for your experience. That's my only "theory", and I find it works pretty well in "practice". ;-)
Sorry, I cannot. First, I'm not the owner. Second, the video includes a couple of fragments whose copyright status is, to put it mildly, questionable -- we had no access to stock footage back then, and we had one week to complete it, so we weren't picky about sources of the footage we used.

Hi, I'm Hrishi, 26, male. I work in air pollution modelling in London. I'm also doing a part-time PhD.

I am an atheist but come from a very religious family background.

When I was 15, I once cried uncontrollably and asked to see God. If there is indeed such a beautiful supreme being then why didn't my family want to meet Him? I was told that their faith was weak and only the greatest sages can see God after a lot of self-inflicted misery. So, I thought never mind.

I've signed up for cryonics. You should too, or it'll just be 3 of us from LW when we wake up on... (read more)

I'm seriously thinking about asking my boss about that one. With a pro-rata decrease in salary, of course.

The extra money just doesn't seem to be worth the constant struggle with myself. Plus I think it would be good to start at a level I'm comfortable with and build on that. By forcing myself to work at a rate I'm clearly incapable of, I'm losing out on all the positive feedback that comes from small successes.

To draw a crude analogy, air pollution modelling is as hard a problem for me as say, AI is for EY. And if he needed to take every other day off on... (read more)

You would probably like Ferriss's The 4-Hour Workweek; it has an example of how to get your boss to let you work from home and stuff like that. Not the same as above, but similar enough to help you.

Genetic engineering aside, given a large aggregation of human beings, and a long time, you cannot reasonably expect rational thought to win. You could as reasonably expect a thousand unbiased dice, all tossed at once, all to come down 'five,' say. There are simply far too many ways, and easy ways, in which human thought can go wrong. Or, put it the other way round: anthropocentrism cannot lose.

That's the same argument against rationalist winning that has been seen many times on LW. However, it is based on hopelessness and fear, rather than on knowledge... (read more)

Hi, "first time, long time." :-> The way I read that, I thought he was talking about even larger, longer term societal structures. Like, imagine many generations of atheist eudaimonia that doesn't collapse on itself -- creating ridiculous new philosophy-religions, over generations. The author's future history seems to involve static human nature at play for a long, long time. Someone needs to give this guy a hug. Or, even better, a copy of "Engines of Creation". And, according to wikipedia, he died in 1994...

Thanks Alicorn. This sounds like a brilliant idea. I have been thinking of something along these lines but hadn't quite thought of day chunks - makes a lot of sense to me too.

I'll give it a try. And yes, I'll be careful.

We value rationality first and foremost because if you take the long view it wins and in the world we populate it wins.

You seem to be making an argument both for and against our cause in the same breath.

The reason irrationality "wins" for the "many people" you mention is that they re-define winning in hindsight when things don't work out.

We are challenging those social systems, which are unaccountable and only provide mysterious explanations when they fail. We aspire to build more robust systems. That's what I think winning is.

I imag... (read more)

Does it really matter if the definition of winning shifts, as long as you still experience the warm fuzzies? I think for some people it doesn't. Quoting Eliezer's OB post: "If satisfying your intuitions is more important to you than money, do whatever the heck you want. Drop the money over Niagara falls. Blow it all on expensive champagne. Set fire to your hair. Whatever. If the largest utility you care about is the utility of feeling good about your decision, then any decision that feels good is the right one." []

I just got off the phone with my mom.

Mom: You're working hard on your PhD, aren't you?

Me: Yes, ma there's lots to do. Oh and I put in a paper for a conference. If it gets accepted I'll go to America to present it.

Mom: Of course it will get accepted. You're working so hard, won't God listen to you?

Everything comes from God. Forget making amazing awe-inspiring monuments. Writing a paper on air pollution in London comes from Him. Getting to go to a conference comes from Him.

My mom can't truly appreciate what I do. Because fundamentally, at the gut level, she ... (read more)

Of course, if it doesn't get accepted, that too will be part of God's plan. Never mind the fact that if God is the explanation for everything, it's really the explanation for nothing []
I notice also in this example, the focus on your hard work rather than the results of your work: producing a good PhD thesis, writing a paper that the conference values. It is as if your hard work would be just as valuable if it did not produce the results, and less valuable if you worked less hard to produce the same results. "Working hard" on your PhD seems about as useful as "trying" to flip a switch []. It seems that these issues are related; hard work is a virtue that God rewards, not the direct cause of good results.
Likewise, have you noticed that after someone successfully undergoes a difficult, risky, multi-hour surgical operation to handle some kind of medical emergency, it's far more correct to thank God for His mercy than it is to, say, credit modern medicine, or even the doctors who performed the operation?
Done. []

I'm confused by that example.

Let's say, by increased attractiveness you mean he started talking more attractively, then that is an outward change, but then the question is whether it was brought about by an inward change.

If the change happened without him thinking about it and only because of his surroundings as a seaman, - which is the point of your post - then it's surely not an inward change.

But if he changed upon reflection of his experience at sea and consciously changing his behaviour, then your robot analogy breaks.


I meant your second option. You're right, "inward/outward" was an inaccurate wording - English is my second language and I sometimes get carried away with the sound of words instead of their meaning, as is customary in Russian. Would've been better to say "change on the outside drives change on the inside". I will mull it over a bit and maybe change the title.

Wow that's amazing Vladimir, well done. The obvious next question is.... how did you do it? Please give an example of at least one of your tricks if possible.

Edit: I've made a top-level post [] for sharing anti-akrasia techniques -- go ahead and share your techniques as well. Let's continue the discussion there.

A very quick outline (I'll post a detailed version later):

* Determine what is your current better judgment. This is critical -- I noticed that I hesitate to trick myself into doing anything I don't consider to be relevant to my goal.
* Constantly asking myself: "is the activity I'm doing at the moment advancing me toward the desired state of reality?" If the answer is "no", know that I'm procrastinating.
* 80/20 elimination, Tim Ferriss / Pareto style (I'm skeptical about the rest of Tim's book, but the Elimination chapter is pure gold).
* Parkinson's law (work expands to fill the time allotted). Again, Tim has some advice on it -- basically, it boils down to scheduling the most important things (in the 80/20 sense) first, with aggressive deadlines.
* PJ Eby's secret meaning of "just do it". He considers the article to be outdated, but its key paragraph worked wonders for me. Basically, "just do it" = "don't do anything else". In its pure form, "not doing anything else" is too macho for me, so I leave a line of retreat for myself -- I permit myself to eat, think about anything (not just the task), walk, have sex, but no surfing unless it's on-topic, no doodling on paper unless it's on-topic, etc.
* Self-priming -- I try to expose myself to stimuli related to my current task, and to shield myself from irrelevant stimuli, no matter how pleasant (e.g. I run away from my toddler daughter, because prolonged exposure to cuteness tends to totally ruin my ability to work efficiently :)
* Begin. For example, if you need to do some stuff in Excel, just open an empty spreadsheet and type in the table header. Just stupidly staring at this makes you better at spreadsheets and your task -- your mind pulls lin
Perhaps any further discussion would be better served by encouraging Vladimir to make a top-level post on the subject? This seems to be veering slightly off-topic.

there's no long-term benefit associated with its removal.

The first step in solving a problem is to recognise it. If I discovered I have cancer I would be demoralised immensely but I'd prefer that and take a shot at recovering rather than die unknowingly.

Denial is not a path to improvement.

I agree that denial usually seems like a bad idea, but the problem with things like stereotype threat [] is that they suggest (and more importantly provide evidence) that sometimes it might actually be useful (a path to improvement [], even if not necessarily the first-best path). The trick, presumably, is to distinguish the situations when this will hold from those when it doesn't.
Yes, but there is a long-term benefit associated with the removal of your cancer. On the other hand, if you had a blemish on your shoulder, you'd be better off not noticing it.
Strongly seconded. I speak from experience: when evidence starts mounting for some horrible, nightmarish proposition that you're scared of, it is tempting to tell yourself that even if it were true, it wouldn't really matter, that there would be no benefit to acknowledging it, that you can just go on acting as you've always done, as if nothing's changed. But on your honor as an aspiring rationalist, you must face the pain directly [].

When you get a hint that this world is not what you thought it was, that you are not what you thought you were--look! And update!--no matter how much it hurts, no matter how much your heart may cry for the memory of the world you thought this was. Do it selfishly, in the name of the world you thought you knew: because once you have updated, once you see this hellish wasteland for what it really is, then you can start to try to patch up what few things you can.

Suppose you really don't like gender roles, and you're quietly worried about something you read about evolutionary psychology. Brushing it all under the rug won't help. Investigate, learn all you can, and then do something []. Maybe something drastic, maybe something trivial, but something. Experiment with hormones! Donate a twenty to GenderPAC []! Use your initials in your byline! But something, anything other than defaulting to ignorance and letting things take their natural course.

Perhaps the title should be 'Outward change obviates inward change'?

Good suggestion, but I'm not sure. Reportedly Arthur Conan Doyle radically increased his attractiveness to women after his stint as a seaman. You will get some inward change into the bargain too.

I didn't say it would be difficult for a religious person to come up with that idea. But if a religious person did come up with it, what does that have to do with their religion?

"Love your neighbor as yourself", perhaps?

there's a warm fuzziness to life that science just doesn't seem to get

Not true. Science helps create new warm fuzzies whereas religion has been re-using the same old one for millennia. The problem with religion is not that it lets people have warm fuzzies but that it provides false explanations.

For example, the building in Ireland that is discussed in the first BHTV episode: I imagine the warm fuzzies one gets on visiting that place are to do with the atmosphere that has been created, that rare experience of the sunlight breaking through carefully craf... (read more)

No, no, no, a thousand times no. Rationality is not how we experience the world; it's how we process our experience. I'm eating something tasty; rationality has nothing whatsoever to do with that immediate experience. I can apply rationality to that experience to figure out how to have more like it, or if a somewhat similar experience would be similar in enough ways to give me a pleasant experience. But if you put "rationality" into "way to experience the world" you get a category error.
I'm skeptical. Why do you think it would be difficult for a religious person to come up with the monument idea, for example?
His point seems to be that rationality isn't the only way to experience the world, which is absolutely, 100% right. But it's the one that wins. And people do want to win.

I want to take issue with this Less Wrong mantra. It's just not true for many people, and you'll have a hard time winning them over if you can't empathize with that. We value rationality first and foremost because if you take the long view it wins and in the world we populate it wins. But for many people recklessness wins, or faith wins - for whatever reason, the social systems they have inherited and constructed for themselves contain constraints which favor nonrational behavior.

Right. It's done through intelligence, that's why rats don't paint. Remember EY's intelligence scale? The distinction is not between village idiot and Einstein. It's between amoeba, chimps, humans and higher intelligences. What I'm basically getting at is that the tendency to emphasize the latter distinction can cause one to undervalue dissimilarity in the human social world.

I get up most easily when I've slept enough. If I get 8 hours of sleep, I don't even have to try getting up. I feel refreshed and am happy to get up. I'm not sure if the number of hours is 8, but from memory it seems to be around that much.

Does anyone else have the same experience?

Right now, no, but that's probably because I permanently have a sleep debt of who-knows-how-many-hours. If I have to get up early in the morning (which is usually), I'll aim to go to bed 8 hours before I have to get up, which is never enough because I don't fall asleep instantaneously. If I don't have to get up early, I'll usually stay up later (where "later" is midnight or so) and drag myself out of bed at some ridiculous-for-me hour of 9 or 9:30 am, because if I get up later than that I feel like I'm wasting my whole day. But there have been times in my life where getting up was easy EVERY day. Getting up at the same time every day helps, especially if that time is 7:30 or so...getting up at 5:00 always feels horrible, probably because it's really hard to reliably be in bed by 9 pm when most people my age are night owls and want to socialize in the evenings. Maybe I could adjust to a schedule where I slept 9 pm to 5 am, but it would mean seeing people less.
"I get up most easily when I've slept enough. . . Does anyone else have the same experience?" I am going to go out on a limb and say that most of us have that experience.

What do you mean by 'practice self-distraction'? Can you give an example?

If you're hungry but, for some reason, think you should refrain from eating, you can go do something sufficiently absorbing that you stop thinking about how hungry you are. Such as play video games.