All of markrkrebs's Comments + Replies

Consequentialism Need Not Be Nearsighted

Good point. It's easy to imagine a lot of biologically good designs getting left unexpressed because the first move is less optimal.

Consequentialism Need Not Be Nearsighted

Hmm, I agree, except for the last part. Blind trial (which is what genetic mixing and mutating amounts to) is like poorly guided forecasting. (Good simulation engineers or chess players somehow "see" the space of likely moves; bad ones just try a lot.) And the species doesn't select; the environment does.

I need to go read "evolve to extinction."

Thanks

Consequentialism Need Not Be Nearsighted

The world we find ourselves in would never expect the doctor to cut the guy up. Few people are doing that consequentialist math; well, maybe a few long thinkers on this site. So the supposed long view as a reason for not doing it is baloney. On that basis alone, I think the thought experiment fails to recommend the conventional behavior it's trying to rationalize.

1Desrtopa11y
We would never expect the doctors to cut the guy up, but hardly any doctors would cut the guy up. Doctors are drawn from the same pool as the rest of society, so society's expectations of their behavior are pretty much on point. In a world where doctors were likely to cut the person up, the public would also be a lot more likely to expect doctors to cut the person up.
Consequentialism Need Not Be Nearsighted

Well, they could EVOLVE that reticence for perfectly good reasons. I'll dare in this context to suggest that evolution IS intelligence. Have you heard of thought as an act of simulating action and forecasting the results? Is that not what evolution does, only the simulations are real, and the best chess moves "selected"?

A species thereby exhibits meta-intelligence, no?

5JoshuaZ11y
I'm not completely sure what you are trying to say. I agree they could potentially evolve such an attitude if the selection pressure was high enough. But evolution doesn't work like a chess player. Evolution does what works in the short term, blindly having the most successful alleles push forward to the next generation. If there were a chess analogy, evolution would be like a massive chess board with millions of players and each player making whatever move looks best at a quick glance, and then there are a few hundred thousand players who just move randomly.
8wedrifid11y
That's a waste of a word. Call evolution an optimisation process (which is only a slight stretch). Then you can use the word 'intelligence' to refer to what you refer to as 'meta-intelligence'. Keeping distinct concepts distinct while also acknowledging the relationship tends to be the best policy. No, it really isn't and using that model encourages bad predictions about the evolution of a species. Species don't 'forecast and select'. Species evolve to extinction [http://lesswrong.com/lw/l5/evolving_to_extinction/] with as much enthusiasm as they evolve to new heights of adaptive performance. Saying that evolution 'learns from the past' would be slightly less of an error but I wouldn't even go there.
Consequentialism Need Not Be Nearsighted

"philosophy tries... to agree with our ...intuition..."? Bravo! See, I think that's crazy. Or if it's right, it means we're stipulating the intuition in the first place. Surely that's wrong? Or at least, we can look back in time to see "obvious" moral postulates we no longer agree with. In science we come up with a theory and then test it in the wind tunnel or something. In philosophy, is our reference standard kilogram just an intuition? That's unsatisfying!

Consequentialism Need Not Be Nearsighted

I had fun with friends recently considering the trolley problem from a perspective of INaction. When it was an act of volition, even (say) just a warning shout, they (we) felt less compelled to let the fat man live. (He was already on the track and would have to be warned off, get it?) It seems we are responsible for what we do, not so much for what we elect NOT to do. Since the consequences are the same, it seems wrong that there is a perceived difference. This highlights, I suppose, the author's presumed contention (consequentialism generally) that the... (read more)

Consequentialism Need Not Be Nearsighted

Your doctor with 5 organs strikes me as Vizzini's Princess Bride dilemma: "I am not a great fool, so I can clearly not choose the wine in front of you."

So it goes, calculating "I know you know I know" unto silliness. Consequentialists I've recently heard lecturing went to great lengths, as you did, to rationalize what they "knew" to be right. Can you deny it? The GOAL of the example was to show that "right thinking" consequentialists would come up with the same thing all our reptile brains are telling us to do.

When you throw a ball... (read more)

3Desrtopa11y
People's moral intuitions are incoherent, and will tend to return different answers for the same dilemma phrased in different terms [http://lesswrong.com/lw/n3/circular_altruism/]. Our evolved heuristics have their uses, among them is not turning into a social pariah in a group that relies on the same heuristics, but they're certainly not isomorphic to strict consequentialism.
Belief in Belief

Surprised not to find Pascal's wager linked to this discussion, since he faced the same crisis of belief. It's well known he chose to believe because of the enormous (infinite?) rewards if that turned out to be right, so he was arguably hedging his bets.

It's less well known that he understood it (coerced belief for expediency's sake) to be something that would be obvious to an omniscient God, so it wasn't enough to choose to believe; rather, he actually Had To. To this end he hoped that practice would make perfect, and I think he died worrying about it. This is d... (read more)
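The wager's arithmetic can be sketched as a toy expected-value comparison. A minimal sketch, assuming purely illustrative numbers (the probability, payoff, and cost names below are my own, not Pascal's):

```python
# Toy expected-value sketch of the wager (all numbers illustrative):
# with any nonzero probability p of being right, an unbounded reward
# swamps any finite cost of believing.

def expected_value(p_right, reward_if_right, cost):
    return p_right * reward_if_right - cost

# Believing carries some finite cost; the payoff, if right, is unbounded.
believe = expected_value(p_right=0.001, reward_if_right=float("inf"), cost=100)
# Not believing: no cost, but no reward either.
abstain = expected_value(p_right=0.001, reward_if_right=0, cost=0)

print(believe > abstain)  # True: 0.001 * inf is still inf, minus 100 is still inf
```

The point of the sketch is only the dominance structure, not the numbers; Pascal's own worry, as above, was that the believing column can't simply be chosen.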

Selfishness Signals Status

For reasons I perhaps don't fully understand, this and threads like it are unsettling to me. Doesn't high status confer the ability (and possibly the duty, in some contexts) to treat others better, to carry their pack, so to speak? Further, acting high status isn't necessary at all if you actually have it (it being the underlying competence that status supposedly, ideally, signifies). I am a high status athlete (in a tiny, circumscribed world) and in social situations try to signal humility, so others won't feel bad. They can't keep up, and if made to feel so, ... (read more)

8wedrifid12y
Keep in mind that describing something is not the same as approving or advocating. I would like it if status was used in the way that you describe. Sometimes it can be, and is, used that way. But it is a mistake to assume that status is supposed to be a measure of competence and an even greater mistake to expect it to be correlated (positively) with treating others better. It just isn't. It's a silly game, but it is one that is played right down to the very core of our instincts. And it is right through to the core of even those 'close relationships' you mention. You can't avoid the game. Just find the parts of it you like or are of particular use to you and satisfice the rest.
The Graviton as Aether

Most excellent. Now, glasshoppah, you are ready to lift the bowl of very hot red coals. Try this

A Fable of Science and Politics

Nice example of Bliks in action. Literature is powered by such dramas, where people's individual mindset shifts the spectrum of every photon right or left of the reader, or the other protagonists, and the tragedy is that too few rays of light fall true, through a clear eye.

Ferris, I suppose, has seceded, too advanced to bother with the various foolish repercussions she knows will ring through the world under her feet from this new data. That's fine, she's too far ahead to go back anyway.

I worry that we (denizens of this website) are too confident that OU... (read more)

2MoreOn11y
If people here are wrong, but you care enough to read, you owe it to them and to yourself to examine their arguments critically.
9RobinZ12y
Not "the" - "a". Being too confident is liable to get you into quite a lot of trouble ... but so is being underconfident [http://lesswrong.com/lw/c3/the_sin_of_underconfidence/]. More important than garbing yourself in properly humble fabrics is actually paying attention when your beliefs are contradicted, and updating your beliefs accordingly. I can be confident that, say, the Ford Taurus was a rubbish car, and change my mind when I discover the first and second generation Tauruses were widely admired [http://en.wikipedia.org/wiki/Ford_Taurus]. My confidence therefore costs me little in this instance.
The Graviton as Aether

You correctly decry popularity as a non-rational measure of veracity, but to the extent that it expresses a sort of straw poll, it may be a good indicator anyway. The idea of expert futures markets comes to mind.

My point is related: is it not also a fallacy to assert it's GOT to be simple? That's awfully close to demanding (even believing?) something's true because it ought to be, because we want it so badly. Occam's razor has worked like a champ all these years, but inference is risky, and maybe now we find ourselves confronted with some hard digging. I too hope some crystalline simplification will make everything make sense, but I don't think we have a right to expect that, or should. What you and I want doesn't matter.

Open Thread: March 2010

I suggest you pay me $50 for each week you don't get and hold a job. Else, avoid paying me by getting one, and save yourself 6 mo x 4 wk/mo x $50 = $1,200! Wooo! What a deal for us both, eh?

3MixedNuts12y
That's an amusing idea, but disincentives don't work well, and paying money is too Far a disincentive to work (now, if you followed me around and punched me, that might do the trick). This reminds me of the joke about a beggar who asks Rothschild for money. Rothschild thinks and says "A janitor is retiring next week, you can have their job and I'll double the pay.", and the beggar replies "Don't bother, I have a cousin who can do it for the original wage, just give me the difference!"
Great Product. Lousy Marketing.

Saying there are white arts as well as dark ones is conceding the point, isn't it? One should be allowed to be persuasive as well as right, and sometimes just being right isn't enough, especially if the audience is judging the surface appeal of an argument (and maybe even accepting it or not!) prior to digging into its meat. In such situations, attractive wrapping isn't just pretty, it's a prerequisite. So, I love your idea of inventing a protocol for DAtDA.

Open Thread: March 2010

The neurology of human brains and the architecture of modern control systems are remarkably similar, with layers of feedback and adaptive modelling of the problem space, in addition to the usual dogged iron-filing approach to goal seeking. I have worked on control systems which, as they add (even minor) complexity at higher layers of abstraction, take on eerie behaviors that seem intelligent within their own small fields of expertise. I don't personally think we'll find anything different, or ineffable, or more, when we finally understand intelligence, ... (read more)
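The "layers of feedback" idea can be sketched in a few lines. This is a toy example of my own, not any particular system: an inner proportional loop doggedly chases a setpoint, while a slower outer loop adapts the inner loop's gain based on residual error — the kind of layered adaptation described above.

```python
# Toy two-layer controller (illustrative only): an inner proportional loop
# drives a trivial first-order plant toward a setpoint, while an outer,
# slower loop tunes the inner loop's gain when error lingers.

def simulate(setpoint=1.0, steps=200, dt=0.1):
    x = 0.0      # plant state
    gain = 0.5   # inner-loop proportional gain
    for t in range(steps):
        error = setpoint - x
        u = gain * error        # inner layer: dogged goal seeking
        x += u * dt             # first-order plant response
        if t % 20 == 0 and abs(error) > 0.01:
            gain *= 1.1         # outer layer: adapt the controller itself
    return x

print(simulate())  # settles very close to the setpoint of 1.0
```

Even this trivial second layer changes the system's character: it is no longer just chasing a goal, it is revising how it chases the goal, which is the eerie quality the comment points at.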

0Karl_Smith12y
I had conceived of something like the Turing test, but for intelligence period, not just general intelligence. I wonder if general intelligence is about the domains under which a control system can perform. I also wonder whether "minds" is too limiting a criterion for the goals of FAI. Perhaps the goal could be stated as an IUCS. However, we don't know how to build an IUCS. So perhaps we can build a control system whose reference point is an IUCS. But we don't know that, so we build a control system whose reference point is a control system whose reference point... until we get to some that we can build. Then we press start. Maybe this is a more general formulation?
Great Product. Lousy Marketing.

Jonathan, I'll try again, with less flair...
My comments were to the original post, which asks if "dark arts" are justified, and I say simply: yes. I think lots of otherwise very smart people who like to communicate with bare logic and none of the cultural niceties of linguistic foreplay can actually alienate the people they hope to persuade. You may have just done that, assuming you were trying to persuade me of something.

Re: losing the audience that demands respect, firstly I was trying to be illustrative in a funny, not disrespectful way, and ... (read more)

Great Product. Lousy Marketing.

I guess I'm for persuasion; I think the ends justify the means in this case. Otherwise you're all bringing knives to a gunfight and crying foul when you get shot down. Could there be a degree of "sour grapes" here, resulting from finding oneself inexplicably unpersuasive next to a velvet-tongued dummy? Are we sure we eschew the tactics of rhetoric because they're wrong? Is it even fair to describe "dark arts" as wrong?

I say rather that speech is meant to be persuasive. Better speech would then be more persuasive. As such, persuasion backed by t... (read more)

2wedrifid12y
I tend to be quite persuasive. I'm sure I'll find something else to be resentful about. 'Wrong'? Describing things as 'wrong' is a dark art.
3Jonathan_Graehl12y
What's the meta-point being made by your obnoxious metaphor-laden text? What's the reason for your abuse of rhetorical questions? It's not effective communication. You demonstrate just how, when you're too heavy on simpleton-wowing flash, you risk losing the part of the audience that demands respect. Here's my gloss of your three paragraphs: 1) "Losers always whine about their best" [http://www.youtube.com/watch?v=ZvzIw7fyA2k] 2) If you're right, and you don't persuade, then your speech wasn't good enough. 3) It's fine to design your speech so as to mislead the dumb and the inattentive toward your side. To the last I'd add the obvious caveat that you should consider how much of your audience you lose if you do it gracelessly.
Welcome to Less Wrong!

Hi! Vectored here by Robin, who's thankfully trolling for new chumps and recommending initial items to read. I note the Wiki would be an awesome place for some help, and I may attempt to put up a page there: NoobDiscoveringLessWrongLeavesBreadcrumbs, or something like that.

My immediate interest is argument: how can we disagree? 1+1=2. Can't that be extrapolated to many things? I have been so happy to see a non-cocky (if prideful) attitude in the first several posts that I have great hopes for what I may learn here. We have to remember ignorance is an imp... (read more)

2wedrifid12y
Aaahhh. Now I see. RobinZ. I usually read 'Robin' as Robin Hanson from Overcoming Bias [http://www.overcomingbias.com], the 'sister site' from the sidebar. That made me all sorts of confused when I saw that you first found us [http://lesswrong.com/lw/j8/the_crackpot_offer/1ohj] when you were talking to a biased crackpot. Anyway, welcome to Lesswrong.com. Let's see:

* One of us is stupid.
* One of us doesn't respect the other (thinks they are stupid).
* One of us is lying (or withholding or otherwise distorting the evidence).
* One of us doesn't trust the other (thinks they aren't being candid with evidence, so cannot update on all that they say).
* One of us doesn't understand the other.
* The disagreement is not about facts (i.e. normative judgments and political utterances).
* The process of disagreement is not about optimally seeking facts (i.e. it is a ritualized social battle).

Some combination of the above usually applies, where obviously I mean "at least one of us" in all cases. Of course, each of those bullet points can be broken down into far more detail. There are dozens of posts here describing how "one of us could be stupid". In fact, you could also replace the final bullet point with the entire Overcoming Bias [http://www.overcomingbias.com] blog.
Rational Me or We?

I love your thesis and metaphor, that the goal is for us all jointly to become rational, seek, and find truth. But I do not "respect the opinions of enough others." I have political/scientific disagreements so deep and frequent that I frequently just hide them and worry. I resonated best with your penultimate sentence: "humanity's vast stunning cluelessness" does seem to be the problem. Has someone written on the consequences of taking over the world? The human genome, presumptively adapted to forward its best interests in a co... (read more)

4wedrifid12y
Unusual threat by a rival tribe. Retaliation necessary. Excuse to take politically self serving moves by surfing a tide of patriotic sentiment. That sort of thing. What you would expect monkeys to care about.
Reason as memetic immune disorder

"The conservatism of a religion - it's orthodoxy - is the inert coagulum of a once highly reactive sap." -Eric Hoffer, the True Believer

Love your post: religion as virulent nam-shub. See also Snow Crash by Stephenson.

8RobinZ12y
Quick tip: HTML doesn't work in the comments, but you can make italics by putting asterisks (*) around the thing to be italicized. There should be a "Help" link below the comment window that will unfold a list of markups.
The Crackpot Offer

I love this site. Found it when looking at a piece of crackpot science on the internet and, wondering, typed "crackpot" into Google. I am trying to argue with someone who's my nemesis in most every way, and I'm trying to do it honestly. I feel his vested interest in the preferred answer vastly biases his judgment, and I wonder what biases I have, and how they got there. You seem to address a key one I liken to tree roots, growing in deep and steadfast wherever you first happen to fall, whether it's good ground or not.

Not unlike that analogy, I landed here first, on your post, and found it very good ground indeed.

Welcome to Less Wrong!

If you want another couple threads to start exploring, one very good starting place is What Do We Mean By Rationality? and its links; then there is the massive collection of posts accumulated in the Sequences which you can pick over for interesting nuggets. A lot of posts (and comments!) will have links back to related material, both at the top and throughout the text.