Given our known problems with actively expressing approval for things, I'd like to mention that I approve of the more frequent open threads.
While reading a psychology paper, I ran into the following comment:
Unfamiliar things are distrusted and hard to process, overly familiar things are boring, and the perfect object of beauty lies somewhere in between (Sluckin, Hargreaves, & Colman, 1983). The familiar comes as standard equipment in every empirical paper: scientific report structure, well-known statistical techniques, established methods. In fact, the form of a research article is so standardized that it is in danger of becoming deathly dull. So the burden is on the author to provide content and ideas that will knock the reader’s socks off—at least if the reader is one of the dozen or so potential reviewers in that sub-subspecialty.
Besides the obvious connection to Schmidhuber's esthetics, it occurred to me that this has considerable relevance to LW/OB. Hanson in the past has counseled contrarians like us to pick our battles and conform in most ways while not conforming in a few carefully chosen ones (e.g. Dear Young Eccentric, Against Free Thinkers, Even When Contrarians Win, They Lose); this struck me as obviously correct, and that one could think of oneself as having a "budget" where non-conforming on both dr...
Awesome job, whoever made this "latest open thread," "latest rationality diary," and "latest rationality quote" thing happen!
One of the most salient differences between groups that succeed and groups that fail is the group members' ability to work well with one another.
A corollary: If you want a group to fail, undermine its members' ability to work with each other. This was observed and practiced by intelligence agencies in Turing's day, and well before then.
Better yet: Get them to undermine it themselves.
By using the zero-sum conversion trick, we can ask ourselves: What ideas do I possess that the Devil¹ approves of me possessing because they undermine my ability to accomplish my goals?
¹ "The Devil" is shorthand for a purely notional opponent whose values are the opposite of mine.
One Devil's tool against cooperation is reminding people that cooperation is cultish, and if they cooperate, they are sheep.
But there is a big exception! If you work for a corporation, you are expected to be a team player, and you have to participate in various team-building activities, which are like cult activities, just a bit less effective. You are expected to be a sheep if you are asked to be one, and to enjoy it. -- It's just somehow wrong to use the same winning strategy outside the corporation, for yourself or your friends.
So we get the interesting result that most people are willing to cooperate if it is for someone else's benefit, but have an aversion to cooperating for their own. If I were trying to brainwash people into obedient masses, I would be proud to have achieved this.
This said, I am not sure what exactly caused this. It could be the natural result of a thousand small-scale interactions: people winning locally by undermining their nearest competitors' agency, and losing globally by polluting the common meme-space. And the people who overcome this and become able to optimize for their own benefit probably find it much easier to attract followers than peers; thus they get out of the system, but don't change the system.
My friend and I are organizing a new meetup in Zagreb, but I don't have enough karma to make an announcement here. Thanks!
[Meta] Most meetup threads have no comments. It seems like it would be useful for people to post to say "I'm coming", both for the organiser and for other people to judge the size of the group. Would this be a good social norm to cultivate? I worry slightly that it would annoy people who follow the recent comments feed, but I can't offhand think of other downsides.
Suggested alternative to reduce the recent comment clutter issue: Have a poll attached to each meetup with people saying if they are coming. Then people can get a quick glance at how many people are probably coming, and if one wants to specifically note it (say one isn't a regular) then mention that in the comment thread.
Here is some verse about steelmanning I wrote to the tune of Keelhauled. Compliments, complaints, and improvements are welcome.
*dun-dun-dun-dun
Steelman that shoddy argument
Mend its faults so they can't be seen
Help that bastard make more sense
A reformulation to see what they mean
Being in Seattle has taught me something I never would have thought of otherwise:
Working in a room with a magnificent view has a positive effect on my productivity.
Is this true for other people, as well? I normally favor ground-level apartments and small villages, but if the multiplier is as consistent as it's been this past week, I may have to rethink my long-term plans.
It could be just the novelty of such a view. I suspect that any interesting modification to your working environment leads to a short-term productivity boost, but these things don't necessarily persist in the long term. In any case, it seems like the VoI of exploring different working environments is high.
Question: Who coined the term "steelman" or "steelmanning", and when?
I was surprised not to find it in the wiki, but the term is gaining currency outside LessWrong.
Also, I'd be surprised if the concept were new. Are there past names for it? Principle of charity is pretty close, but not as extreme.
Google search with a date restriction and a few other tricks to filter out late comments on earlier blog posts suggests Luke's post Better disagreement as the first online reference, though the first widely linked reference is quite recent, from the Well Spent Journey blog.
Saw this on twitter. Hilarious: "Ballad of Big Yud"
There is another video from the same author explaining his opinions on LW. It takes two minutes before it even starts talking about LW, so here are the important parts: ---
The Sequences are hundreds and hundreds of blog posts, written by one man. They read like a catechism and teach strange vocabulary like "winning", "paying rent", "mindkilling", "being Bayesian".
The claim that Bayes' theorem, which is just a footnote in statistics textbooks, has the power to reshape your thinking so that you can maximize the outcomes of your life... has no evidence behind it. You can't reduce the complexity of life to simple probabilities. EY is a high-school dropout and has no peer-reviewed articles.
People on LW say that criticism of LW is upvoted. Actually, that "criticism" does not disagree with anything -- it just asks MIRI to be more specific. Is that LW's best defense against accusations of cultishness?
The LW community believes in the Singularity, which, again, has no evidence behind it, and the scientific community does not support it. MIRI asks for your money, and does not say how specifically it will be used to save the world.
LW claims that politics is the mindkiller, yet EY adm...
There's a user at RationalWiki, one of the dedicated LW critics there, called "Baloney Detection". I often wondered who it was. The image at 5:45 in this video, and the fact that "Baloney Detection" also edited the "Julia Galef" page at RW to decry her association with LW, tells me this is him...
By the way, the RW article about LW now seems more... rational... than the last time I checked. (Possibly because our hordes of cultists sponsored by the right-wing extremist conspiracy fixed it, hoping to receive the promised 3^^^3 robotic virgins in singularitarian paradise as a reward.) You can't say the same thing about the talk pages, though.
It's strange. Now I should probably update towards "a criticism of LW found online probably somehow comes from two or three people on RW". On their talk pages, Aris Katsaris sounds like a lonely sane voice in a desert of... I guess it's supposed to be "rationality with a snarky point of view", which works like this: I can say anything, and if you catch me lying, I say I was exaggerating to make it funnier.
Some interesting bits from the (mostly boring) talk page:
Yudkowsky is an uneducated idiot because there simply can't be 3^^^3 distinct people
A proper skeptical argument about why "Torture vs Dust Specks" is wrong.
...what happened is that they hired Luke Muehlhauser who doesn't know about anything technical but can adequately/objectively research what a research organization would look like, and then
I agree, but we are speaking about approximately 13 downvotes from 265 total votes. So we have at least 13 people on LessWrong who oppose a high-quality criticism.
Or there are approximately 13 people who believe the post is worth a mere 250 votes, not 265, and so used their votes to push it in the desired direction. Votes needn't be made, or considered to be made, independently of each other.
In 2011, he described himself as "a very small-'l' libertarian" in this essay at Cato Unbound.
Could I get some career advice?
I'd like to work in software. I can graduate next year with a math degree and look for work, or I can study for additional CS-specific credentials (two or three extra years for a Master's degree).
On the one hand, I'm told online that programming is unusually meritocratic, and that formal education and credentials matter very little if you can learn and demonstrate competency in other ways, like writing your own software or contributing to open-source projects.
On the other hand, mid-career professionals in other fields (mostly engineering) have told me that education credentials are an inevitable filter for raises, hiring, layoffs, and just getting interesting work. They say that getting a graduate degree will be worthwhile even if I could have learned equally valuable skills by other means.
I think I would enjoy and do well in graduate school, but if it makes little career difference, I don't think I would go. I'm skeptical that marginal credentials are unimportant (or will remain unimportant in ten years), but I don't know any programmers in person whom I could ask.
Any thoughts or experiences here?
I've recently noticed a new variant of failure mode in political discussions. It seems to be most common on political discussions where one already has almost all Blues or all Greens. It goes like this:
Blue 1: "Hey look at this silly thing said by random silly Green. See this website here."
Blue 2, Blue 3... up to Blue n: "Haha! What evil idiots."
Blue n+1 (or possibly Blue sympathizer or outright interloper or maybe even a Red or a Yellow): "Um, the initial link given by Blue 1 is a parody. That website does satire."
Large subset of Blue 2 through Blue n: "Wow, the fact that we can't tell that's a parody shows how ridiculous the Greens are."
Now at this point, the actual failure of rationality happened with Blues not Greens. But somehow Blues will then count this as further evidence against Greens. Is there any way to politely get Blues to understand the failure mode that has occurred in this context?
What with the popularity of rationalist!fanfiction, I feel like there's an irresistible opportunity for anyone familiar with the Animorphs books.
Imagine it! A book series where sentient slugs control people's bodies, yet can communicate with their hosts. To borrow from the AI Box experiments, the Yeerks are the Gatekeepers, and the Controlled humans are the AIs! One could use the resident black-sheep character David Hunting, who was introduced in the middle of the series, removed three books later, and didn't really do anything important, as the rationalist! character. I couldn't write such a thing, but it would be wicked if someone else did.
I've run into a roadblock on the Less Wrong Study Hall reprogramming project. I've been writing against Google Hangouts, but it seems that there's no way to have a permanent, public hangout URL that also runs a specified application. (that is, I can get a fixed URL, or a hangout that runs an app for all users, but I can't do both)
Any of the programmers here know a way around that? At the moment it's looking like I'll have to go back to square zero and find an entirely different approach.
What are good sources for "rational" (or at least not actively harmful) advice on relationships?
What sort of relationships? Business? Romantic? Domestic? Shared hobby?
The undercurrent that runs along good advice for most is "make your presence a pleasant influence in the other person's life." (This is good advice for only some business relationships.)
Athol's advice is useful; he does excellent work advising couples with very poor marriages. So far I have not encountered anything of his more unethical than mainstream relationship advice. Indeed, I think it less toxic than mainstream relationship advice.
As to misogyny, this is a bit awkward: I actually cite him as an example of a very much not-woman-hating red pill blogger. Call Roissy a misogynist and I will nod. Call Athol one and I will downgrade my estimate of how bad misogyny is.
I disagree that his outlook is toxic. He uses a realistic model of the people involved and recommends advice that would achieve what you want under that model. He repeatedly states that it is a mistake to make negative moral judgement of your partner just because they are predictable in certain ways. His advice is never about manipulation, instead being win-win improvements that your partner would also endorse if they were aware of all the details, and he suggests that they should be made aware of such details.
I see nothing to be outraged about, except that things didn't turn out to be how we previously imagined them. In any case, that's not his fault, and he does an admirable job of recommending ethical relationship advice in a world where people are actually physical machines that react in predictable ways to stimuli.
Seriously, would you enjoy playing the part of a cynical, paranoid control freak with a person whom you want to be your life partner?
Drop the adjectives. I strive to be self-aware, and to act in the way that works best (in the sense of happiness, satisfaction, and all the other things we care about) for me and my wife, given my best model of the situation....
Serious damage to who? Idiots who fail to adopt his advice because he calls it a name that is associated with other (even good) ideas that other idiots happen to be attracted to? That's a tragedy, of course, but it hardly seems pressing.
Seems to me that people should be able to judge ideas on their quality, not on which "team" is tangentially associated with them. Maybe that's asking too much, though, and writers should just assume the readers are morally retarded, like you suggest.
I'm somewhat familiar. My impression is that the steelman version of it is a blanket label for views that reject the controversial empirical and philosophical claims of the mainstream left:
Pointing out that an idea has stupid people who believe it is not really a good argument against that idea. Hitler was a vegetarian and a eugenicist, but those ideas are still OK.
It selects for these attitudes in its adherents
So?
Here's why that's true: "Red Pill" covers empirical revisionism of mainstream leftism. What kind of people do you expect to be attracted to such a label without considering which ideas are correct? I would expect bitter social outcasts, people who fail to ideologically conform, a few unapologetic intellectuals, and people who reject leftism for other reasons.
Then how are those people going to appear to someone who is "blue pilled" (ie reasonable mainstream progressive) for lack of a better word? They are going t...
I've been reading a lot of red pill stuff lately (while currently remaining agnostic), and my impression is that most of the prominent "red pill" writers are in fact really nasty. They seem to revel in how offensive their beliefs are to the general public and crank it up to eleven just to cause a reaction. Roissy is an obvious example. About one third of his posts don't even have any point, they're just him ranting about how much he hates fat women. Moldbug bafflingly decides to call black people "Negroes" (while offering some weird historical justification for doing so). Regardless of the actual truth of the red pill movement's literal beliefs, I think they bring most of their misanthropic, hateful reputation on themselves.
I haven't read Athol Kay, so I don't know what his deal is.
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.