Comments

Good point. It's easy to imagine a lot of biologically good designs going unexpressed because the first move is less than optimal.

Hmm, I agree, except for the last part. Blind trial (what genetic mixing and mutating does) is like poorly guided forecasting (good simulation engineers or chess players somehow "see" the space of likely moves; bad ones just try a lot), and the species doesn't do the selecting, the environment does.

I need to go read "Evolving to Extinction."

Thanks

The world we find ourselves in would never expect the doctor to cut the guy up. Few people are doing that consequentialist math; well, maybe a few long thinkers on this site. So the supposed long view as a reason for not doing it is baloney. On that basis alone, I think the thought experiment fails to recommend the conventional behavior it's trying to rationalize.

Well, they could EVOLVE that reticence for perfectly good reasons. I'll dare, in this context, to suggest that evolution IS intelligence. Have you heard of thought as an act of simulating action and forecasting the results? Is that not what evolution does, only the simulations are real and the best chess moves get "selected"?

A species thereby exhibits meta-intelligence, no?

"philosophy tries... to agree with our ...intuition..."? Bravo! See, I think that's crazy. Or if it's right, it means we're stipulating the intuition in the first place. Surely that's wrong? Or at least, we can look back in time to see "obvious" moral postulates we no longer agree with. In science we come up with a theory and then test it in the wind tunnel or something. In philosophy, is our reference standard kilogram just an intuition? That's unsatisfying!

I had fun with friends recently considering the trolley problem from a perspective of INaction. When saving him required an act of volition, even (say) just a warning shout, they (we) felt less compelled to keep the fat man alive. (He was already on the track and would have to be warned off, get it?) It seems we are responsible for what we do, not so much for what we elect NOT to do. Since the consequences are the same, it seems wrong that there is a perceived difference.

This highlights, I suppose, the author's presumed contention (consequentialism generally) that the correct ethical choice is the one reached by careful (perhaps expensive!) calculation of long-term outcomes, and matches what feels right only coincidentally. I think in the limit we would (consequentialists all) just walk into the hospital and ask for vivisection, since we'd save five lives. The reason I don't isn't JUST altruism, because I wouldn't ask you to either; it's a step closer to Kant's absolutism: as humans we're worth something more than ants (who, I submit, are all consequentialists?) and have individual value. I need to work on expressing this better...

Your doctor with 5 organs strikes me as Vizzini's Princess Bride dilemma: "I am not a great fool, so I can clearly not choose the wine in front of you."

So it goes, calculating I-know-you-know-I-know unto silliness. Consequentialists I've recently heard lecturing went to great lengths, as you did, to rationalize what they "knew" to be right. Can you deny it? The GOAL of the example was to show that "right thinking" consequentialists would arrive at the same thing all our reptile brains are telling us to do.

When you throw a ball, your cerebral cortex doesn't do sums to figure out where it will land; primitive analog calculation does it fast and with reasonable accuracy. As we all know, doctors across the nation don't do your TDL sums either, nor do I think they've internalized the results unconsciously. They have an explicit moral code which, in its plain statements, would disagree.

The thing I find interesting, the challenge I'd like to suggest, is whether consequentialism is somewhat bankrupt in that it bends over backwards to "prove" things we all seem to know, instead of daring to prove something less obvious (or perhaps unknown or controversial). If you can make a NEW moral statement, and argue well enough to make it stick, well, that's like finding a new particle of matter or something: quite valuable.

I'm surprised not to find Pascal's Wager linked to this discussion, since he faced the same crisis of belief. It's well known he chose to believe because of the enormous (infinite?) rewards if that turned out to be right, so he was arguably hedging his bets.
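For concreteness, a minimal sketch of the expected-value arithmetic usually read into the Wager (my own notation, not anything Pascal or this discussion supplies): take a credence p > 0 that God exists, an infinite reward for belief if so, a finite cost c of believing, and a finite worldly benefit b of not believing.

$$E[\text{believe}] = p \cdot \infty + (1-p)(-c) = \infty, \qquad E[\text{not believe}] \le p \cdot 0 + (1-p)\,b < \infty$$

So belief dominates for any nonzero p, which is the "hedging his bets" reading; whether "choose to believe" is even an available action is the sincerity problem below.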

It's less well known that he understood it (coerced belief for expediency's sake) to be something that would be obvious to an omniscient God, so it wasn't enough to choose to believe; he actually Had To believe. To this end he hoped that practice would make perfect, and I think he died worrying about it. This is described in the Wikipedia article in an evasive third person, but a philosophy podcast I heard attributed the dilemma of insincere belief to Pascal directly.

Fun stuff.

For reasons I perhaps don't fully understand, this and threads like it are unsettling to me. Doesn't high status confer the ability (and possibly the duty, in some contexts) to treat others better, to carry their pack so to speak? Further, acting high status isn't necessary at all if you actually have it (it being the underlying competence that status supposedly, ideally, signifies). I am a high-status athlete (in a tiny, circumscribed world) and in social situations I try to signal humility, so others won't feel bad. They can't keep up, and if made to feel so, will not want to come again. Maybe in this forum we just want to drop anyone who can't keep the pace.

If I see someone acting supercilious or indifferent, signaling status on all frequencies, I will infer they have something to hide, or strong feelings of incompetence that need to be stroked. Now we can play the game of I-know-you-know that high-status signalers may be compensating, but it's a silly game, because faking status, if that's what you want, is only a temporary fiction. Any close relationship will soon scrape through that whitewash. Unfortunately, I think poseurs do manage to get by quite well in the world, by exactly the techniques being discussed here. Maybe everybody should get a tattoo with their VO2max and IQ right on their forehead?

Most excellent. Now, glasshoppah, you are ready to lift the bowl of very hot red coals. Try this
