Shit rationalists say - 2018

by ChristianKl · 1 min read · 20th Feb 2018 · 27 comments

32 points · Humor · Community · Personal Blog

In 2012 we had a thread titled Shit rationalists say that led to the fun video Shit rationalists say. Given that the video is a lot of fun to watch, how about starting a new list that's up to date?

Share whatever comes to mind and have fun :)


"I'm not saying that anyone isn't doing the best they can individually, I'm just saying that they collectively really ought to be able to find a better Nash equilibrium."

"Oh, you're in <major field/institution>? I thought Moloch had already eaten all the value there."

"No, I was into AI x-risk before it was cool."

"No of course I'm not saving for retirement."

"Well, in science we have standards of what qualifies as evidence. In increasing order of respectability, we have personal opinion, expert opinion, case reports, cohort studies, RCTs, and meta-analyses.

And then if none of those work, we use what's known as an 'SSC lit review'."

"Maybe it would help if you moved to the Bay?"

"Never ever ever move to the Bay."

"You may have changed my betting odds, but you haven't changed my models!"

"Here's five dollars; I knew you would have made that bet with me had I offered it to you at the time, and it turns out I was wrong."

"Someone's clearly being modest! I won't believe you until you actually buy all the lightbulbs you can and prove me wrong."

"Logical Induction: a financial solution to the computer science problem of metamathematics."

"I don't read the comments on SSC per se. I just use ctrl-F and then read all the threads that Scott has replied to."

"In reality, humans never recurse more than 3 times. This is called the 'Pants Principle', based off an exception to the rule where Eliezer's pants were supposed to go in the washing but ended up in Washington DC."

"Did you know that all statistical distributions are actually power laws?"

"There are only two books: The Sequences, and the collected essays of Paul Graham. All else is stamp-collecting."

"None of this is a coincidence, because nothing is ever a coincidence."

"Taboo tradeoffs? Sure: I can have different quantities of various things, and I need to pick the quantities of each thing / allocate resources among them in the way that maximises my preferences given the various costs."

"I live with rationalists, and all my friends are rationalists. That's right, I consider myself 'rationalist-adjacent'."

"Have you considered just reading the Feynman Lectures on Physics?"

"I HAVE NOT WORKED OUT ALL OF THE BUGS IN THE PART_SEA FUNCTION."

"I think we need to make the concept of common knowledge common knowledge."

"I think my organisation is coming down with a case of cost-disease."

"I call 'beliefs' the things that generate my predictions. I call 'reality' the thing that determines the outcomes."

"The control group is out of control!"

"All of my terminal goals are actually about aesthetics and art, but I still care about AI because avoiding an existential catastrophe is the convergent instrumental goal of many goal systems - including mine."

FYI, the pants ended up in Washington DC (I think this is important because they were supposed to end up in the washing machine)

"Here's five dollars; I knew you would have made that bet with me had I offered it to you at the time, and it turns out I was wrong."

It me.

Rationalist dating:

"I know this might be a weird analogy, but relationships are like LessWrong"

"Should I date Jaan Tallinn? I'm not attracted to him at all, but it seems high-EV."

"You're so cute I just want to invest all my resources in you!"

"I know we should go to sleep, but have you considered hyperbolic discounting?"

"Now that you've witnessed my inability to apply instrumental rationality to the problem of going to sleep on time you have to break up with me"

"I don't have to take care of myself because in twenty years we won't even have human bodies anymore!"

"Wait don't go, you're instrumentally useful to me!"

"I can't rationally play card games. Whatever hand I'm dealt, the prior odds of getting that hand are so low that my fallible eyes cannot possibly provide enough log-odds to raise it even to 50-50."

"I should make a TAP for that."

"It's not evidence, but it's Bayesian evidence."

"Every statement is true in a much larger proportion of all universes in which it is said than of those in which it is not said, therefore hearing a thing said is Bayesian evidence that it is true. We should have sex."

"I'm curious why..."

<Wiggles fingers in agreement>

"...well, we have QALYs. They're kinda questionable, but at least they're numbers."

Wait, is the finger wiggle for agreement a thing that rationalists do too? I picked this up from the authentic relating community in Austin and didn't realize it was more widespread.

It's based on American sign language for applause (http://www.lifeprint.com/asl101/pages-signs/a/applause.htm).

I have heard from another rationalist that it was popularized by the Occupy movement, since they often needed to manage discussions with large numbers of people.

Various communities have adopted it since, because it seems like good social tech.

CFAR instructors introduced it a few years ago, I think. Not sure where they got it from but it may have been authentic relating.

I've encountered it in another personal development type thing, where it was said that it came from activists holding clandestine meetings in South Africa in the days of apartheid, as a silent substitute for applause.

Honestly I just watched the video for the first time and I don't think it needs an update; it's still pretty painfully accurate :P

"That's a completely crap idea."

"But shouldn't we try it out, to find out if it is?"

"Humans have only lived like that for 100,000 years or so. In the Environment of Evolutionary Adaptedness..."

"Wanna expand your comfort zone with me?"

"It is difficult to live in full awareness of the fact that every choice is a choice about the entire future history of all possible universes in which there is an entity computationally similar to me, but I must try to maximise the measure of my self-analogues that do."

"My System 1 is not very compelled by bio-risk. Possibly I just have to train imagining everybody dying of aging or something."