A Kick in the Rationals: What hurts you in your LessWrong Parts?

by sixes_and_sevens · 1 min read · 25th Apr 2012 · 194 comments


Open Threads
Personal Blog

A month or so ago I stumbled across this.  It's a blog piece by one Robert Lanza M.D., a legitimate, respected biologist who has made important contributions to tissue engineering, cloning and stem cell research.  In his spare time, he is a crackpot.

I know I shouldn't give any of my time to an online pop-psychology magazine which has "Find a Therapist" as the second option on its navigation bar, but the piece in question could have been *designed* to antagonise a LessWrong reader: horrible misapplication of quantum physics, worshipful treatment of the mysterious, making a big deal over easily dissolvable questions, bold and unsubstantiated claims about physics and consciousness... the list goes on.  I'm generally past the point in my life where ranting at people who are wrong on the internet holds any appeal, but this particular item got my goat to the point where I had to go and get my goat back.

If reading LW all these years has done anything, it's trained me to take apart that post without even thinking, so (and I'm not proud of this), I wrote a short seven-point response in the comments lucidly explaining its most obvious problems, and signed it Summer Glau.  It got removed, and I learned a valuable lesson about productively channeling my anger.

But this started me thinking about how certain things (either subjects or people) antagonise what I now think of as my LessWrong Parts, or more generally cause me distress on an epistemic level, and about what my subjective experience of that distress is like, so I can recognise and deal with it in future.

I've seen a few other people make comments describing this kind of distress (this description of "being forced to use your nicely sharpened tools on a task that would destroy them" seems particularly accurate).  Common culprits seem to be critical theory, postmodernism and bad philosophy.  I've also noticed some people distress me in this fashion, in a way I'm still struggling to characterise.

Who else has this experience?  Do you have any choice examples?  What hurts you in your LessWrong Parts?



I had a dream this post was promoted and got 172 karma, spawning another post calling for DDOS attacks and other cyber-terrorism on Psychology Today by LW. Eliezer promoted that article too, and LW went to war with them. Eliezer got arrested and LW was shut down. It was weird.

-2sixes_and_sevens9yProphetic, maybe :-)
5RobertLumley9yLet's hope not.

Oh, lots of things. "Suspension of moral disbelief," I suppose, causes me to rage the hardest inside, though I rarely get in arguments over it. There's too much inferential distance to close before people change from defense/rationalization mode to actually-goddamn-thinking-about-it mode. So I don't generally go about to my family members screaming "YOUR GOD CONDONES RAPE!" even though every time I hear an argument about how morality comes from god, my blood boils.

Invest in people proportionate to your expected return. Your prior on returns should be very low; most people are a waste of time and resources. (Specifically, for this particular example, your investment was your emotional reactance to things they do.) Low but still positive, so you invest a tiny bit and watch what happens. If you actually get some returns, great! Repeat with a slightly larger investment. Otherwise start with a new person.

Your anger is understandable, things are more frustrating when they are close to awesome but then fail in a stupid way than things that were never awesome. These "almost awesome" things give you a glimpse of something amazing before snatching it away. Even though our platonic awesome thing never really existed we still feel loss.

5Alsadius9yUnless the argument itself is the part you get returns from. I've long admitted that arguing on the internet is utterly pointless. But so is watching TV or playing video games, and this at least makes me smarter sometimes.

I tend to lose interest after encountering something like "Our current theories of the physical world don't work, and can never be made to work until they account for life and consciousness." (The writer is mentally classified as a hopeless case, so no fun to be had.) This is probably a defensive mechanism developed after 5 years in a physics IRC channel.

Yet I still get frustrated when an apparently elementary error is committed by a person who should know better (especially if, after some careful analysis, this person turns out to be me).

And it amuses me when rationalists get frustrated at the elementary errors, and mistakenly think that they "should know better", despite the overwhelming evidence that they don't. It especially amuses me when that rationalist is me. I should know better, and I do upon reflection, but rarely do in the moment.

9NancyLebovitz9yI calm myself with the idea that if I don't know how to be more intelligent, it isn't reasonable for me to expect people who are less intelligent than I am to know how to be more intelligent.
4[anonymous]9yIt seems our brains really are built to assume short inferential distances [http://lesswrong.com/lw/kg/expecting_short_inferential_distances/].

Probably the only two examples that I can think of from my personal experience are:

1) A post that one of my old high school classmates made on Facebook saying (and I paraphrase): "[the existence of a personal god] is literally too good to be true, which is why we should believe it."

2) Being forced to take a class in "critical thinking" which actually turned out to utilize pretty much every dark arts technique in the book to convert you to the professor's political agenda.

2) Being forced to take a class in "critical thinking" which actually turned out to utilize pretty much every dark arts technique in the book to convert you to the professor's political agenda.

That sounds like it could be the final exam in a class on critical thinking.

3taelor9yAnother one: In the film Rear Window, the protagonist witnesses a dog sniffing around a flower bed; later, that same dog is found dead. The protagonist responds by having his girlfriend dig up the flower bed, only to find nothing. From this, he concludes that his suspicion that his neighbor is a murderer is correct, and sends his girlfriend to break into said neighbor's house. Of course, by authorial fiat, he ends up being right. Also, anything that ICP has said, ever. But this [http://www.guardian.co.uk/music/2010/oct/09/insane-clown-posse-christians-god] takes the cake.

For any of you who don't get it, here's the xkcd comic he referenced.

5thomblake9yAlso a less obvious reference to this [http://xkcd.com/386/].

I wonder why your comment got deleted. Was it potentially inflammatory? Does the website not like people signing things Summer Glau when they aren't Summer Glau? (Unless you are Summer Glau, of course.) Was there a link in it? I don't want to assume they just don't like dissent, but sadly that happens sometimes. If you still have the comment I'd like to see it.

As for examples, I'm currently taking a class in "Science and Religion" that is full of minor instances of this. Yesterday I got deathism and elan vital, plus some arguing by definition over "life". Deathism is more a kick in the morals than a kick in the rationals, but it's the same feeling of "argh this is bad and there's nothing I can do about it".

this particular item got my goat to the point where I had to go and get my goat back.

I am going to steal this phrase.

and signed it Summer Glau.

Awesome. One more xkcd fulfilled, n left to go.

It was very much of the tone "I am now going to explain to you why you are wrong", but it was still civil. Rough outline:

1) Quantum mechanics does not say that.

2) Strong anthropic principle is a bold claim you've failed to substantiate.

3) Saying "our current theories of the physical world don't work" is outrageous coming from a man who attracts other objects towards him with a force proportional to the product of their masses and inversely proportional to the square of the distance between them.

4) The physical processes underlying organic life are perfectly compatible with a lawful physical universe, and fairly well understood by the standards of many academic disciplines (and you should know that, because you're actually an expert on the subject). To date, no mental phenomena have demonstrated properties that violate the laws of physics.

5) "Tree in the forest" is an artefact of the semantic history of our language, and nothing to do with physics.

6) Remaining few paragraphs are presented in a needlessly confusing way to obfuscate some fairly straightforward ideas. Obviously things we label "optical effects" require optical devices in order t... (read more)
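For reference, the relationship that point 3 riffs on is Newton's law of universal gravitation, under which any two bodies attract each other with a force

$$F = G \, \frac{m_1 m_2}{r^2}$$

where $m_1$ and $m_2$ are the two masses, $r$ is the distance between them, and $G \approx 6.674 \times 10^{-11}\,\mathrm{N\,m^2\,kg^{-2}}$ is the gravitational constant.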

6fubarobfusco9yUh, your mom is so massive she attracts other objects toward her with a force proportional to the product of their masses and inversely proportional to the square of the distance between them. (In other words: You do realize this sounds like the nerd version of a fat joke, right?)
7sixes_and_sevens9yThat honestly didn't occur to me when I wrote it. It was supposed to be a riff on an Ad Hominem attack, only with a factual statement about a theory of physical law and how he conforms to it.
2Normal_Anomaly9yHere I thought it was a snarky statement about how we can see the success of physical theories with our own observations, and have never observed them to fail. Triple illusion of transparency all the way across the sky!
4sixes_and_sevens9yIts purpose was to demonstrate how physical theories are demonstrably successful. Its delivery was the Ad Hominem riff.
1sixes_and_sevens9yAlso "your face is so massive..." etc.
2shminux9yCertainly comes across as condescending and indignant, thanks to the words like "needlessly confusing way to obfuscate" and "outrageous".
5RobertLumley9yWords cannot express how awesome it would be if Summer Glau was a LW regular.

Words cannot express how awesome it would be


How regularly would you have to post to be considered a regular?

-2MartinB9yAre you aware that there might be other people by the same name?
8RobertLumley9yYes. But it was nevertheless clear which person in the reference set of "People with the name Summer Glau" I was referring to.
[-][anonymous]9y 13

Mass Effect kicks me in the LW.

Quantum entanglement communication. AI (including superAI) all over the place, life still normal. Bad ethics. Humans in funny suits.

Your strength as a rationalist is your ability to scream 'bullshit' and throw the controller at the screen.

Eh. If violations of physics and common sense (never mind unusual cognitive-science concepts) in space opera had the ability to make me angry, I'd have to spend most of my time getting angry. Mass Effect actually seems fairly sane as space opera goes, though its handling of the robot rebellion motif is pretty damned ham-fisted.

The most recent Deus Ex game actually bothered me more, thanks to explicitly tackling transhumanist themes and then completely failing to resolve them in a way that showed any understanding of the subject matter. Very little in media irritates me more than a work rendering complex philosophical issues down into a trite moral quandary and then trying to award itself cool points for knowing about the issue at all.

3Jayson_Virissimo9yIn what way could Deus Ex: Human Revolution have "resolved" its transhumanist themes without violating continuity with Deus Ex?
6handoflixue9yAny rationalist that can't enjoy a simple story... much less resorting to throwing their controller at the screen... has demonstrated weakness, not strength. You have cut yourself out of a huge part of culture, the "human experience", simply so that you can proclaim "bullshit!" and be angry, without effecting any actual change in the world.
7[anonymous]9ysure. I know, I should ignore the stupid things and just enjoy the art, but throwing your controller at the screen is an unfortunate side effect of breaking down compartmentalization and cultivating an aversion to bad thinking. Maybe I will be able to enjoy fiction again when I reach your level. Until then, it is a cost to be paid.
2[anonymous]9yEhn. On the other hand, different folks enjoy different stories in different ways. And may even derive some enjoyment from analyzing or looking at stories they themselves didn't actually enjoy. There's also engaging with a narrative critically; "enjoy a simple story" doesn't mean "don't think about this at all" or "have only positive to neutral reactions."
0handoflixue9yIf one genuinely enjoys throwing controllers at the screen, and is well off enough to afford the replacement TVs when one inevitably fractures from the force of the blows, sure. Personally, I got the rather strong impression that nyan_sandwich was throwing the controller because of frustration, not euphoria.
0Random8329yDefine inevitably. I don't think I could throw a controller hard enough to damage a CRT or a rear projector. These suggest designs for protective covers (for the former, put the TV behind thick curved glass; for the latter, put it behind a durable plastic sheet held in a rigid frame).
0handoflixue9yThat was playful exaggeration, sorry ^.^; I am surprised to hear that a CRT is considered that durable. I can bend deadbolts and I've had friends take a metal door off its frame, so I was raised on an odd sense of what "normal" strength is.
2Random8329yLarge CRTs are made of very thick curved glass. I once did hit one hard enough to chip it, which left a hole several millimeters deep and did not appear to affect the structural integrity of the tube. But I don't know about "that durable" - if you dropped one from a sufficient height it would surely break; it's more a question of how much force you (or I) can throw a controller with.
0handoflixue9yMy previous basis for it was my electronics teacher talking about a friend taking one into a shop, dropping it, and having it shatter. This would have been a height of 4-5 feet, since it was held in arms. Maybe modern CRTs are thicker / more durable? Given my electronics teacher, it's also entirely possible he just enjoyed dramatic stories...
2Random8329yWell, don't forget that it will hit the ground with a force proportional to its weight. You probably wouldn't want him to have dropped it on your head - it would be a rather more unpleasant experience than having a controller thrown at your head.
-2Vladimir_Golovin9yYep. Most mass-market space operas are guilty of this. Despite having knowledge and resources to fly to other planets, humans in them still have to shoot kinetic bullets at animals. However, stories, in order to be entertaining (at least for the mainstream public), have to depict a protagonist (or a group thereof) who are changing because of conflict, and the conflict has to be winnable, resolvable -- it must "allow" the protagonist to use his wit, perseverance, luck and whatever else to win. Now imagine a "more realistic" setting where humans went through a singularity (and, possibly, coexist with AIs). If the singularity was friendly, then this is a utopia which, by definition, has no conflict. If the singularity was unfriendly, humans are either already disassembled for atoms, or soon will be -- and they have no chance to win against the AI because the capability gap is too big. Neither branch has much story potential. This applies to game design as well -- enemies in a game built around a conflict have to be "repeatedly winnable", otherwise the game would become an exercise in frustration. (I think there is some story / game potential in the early FOOM phase where humans still have a chance to shut it down, but it is limited. A realistic AI has no need to produce hordes of humanoid or monstrous robots vulnerable to bullets to serve as enemies, and it has no need to monologue [http://tvtropes.org/pmwiki/pmwiki.php/Main/EvilGloating?from=Main.Monologuing] when the hero is about to flip the switch. Plus the entire conflict is likely to be very brief.)
9pedanterrific9yHow is this a utopia?
8sixes_and_sevens9yData from Star Trek doesn't quite give me the lurching despair I was thinking of when I wrote the original post, but he does make me do a mental double-take whenever a physical embodiment of human understanding of cognition sits there wondering about esoteric aspects of human behaviour that were mysterious to sci-fi screenwriters in the early 1990s.
2Strange79yTo be fair, he didn't actually have access to Soong's design notes.
1sixes_and_sevens9yData's awareness of his own construction varies as befits the plot. My point was that TNG often asked a lot of questions about ethics and cognition and personhood and identity. Data himself talks about the mysterious questions of human experience all the bloody time. In a world where Data exists, significant headway has been made on those questions already.
1TheOtherDave9yThis is a special case of a general property of the Star Trek universe: it exhibits a very low permeability to new information. Breakthroughs and discoveries occur all over the place that have only local effects. I've generally assumed that there's some as-yet-unrevealed Q-like entity that intervenes regularly to avoid too many changes in the social fabric in a given period of time.
1Strange79yThe Federation government being deeply corrupt would also explain a lot.
3[anonymous]9yBwahaha. Have you seen the end of Mass Effect 3? The "win" is worse than letting the bad guys do their thing.
-1RobertLumley9yCan you rot13 the ending for us? I've never played it and never intend to, but I wouldn't mind knowing what you're talking about.
3khafra9yN zvyyvbaf-bs-lrnef-byq fhcrevagryyvtrapr inyhrf yvsr, ohg unf qrgrezvarq gung gur bayl jnl gb fhfgnva yvsr va gur tnynkl vf gb crevbqvpnyyl jvcr bhg nqinaprq pvivyvmngvbaf orsber gurl varivgnoyl frys-qrfgehpg, qrfgeblvat tnynpgvp srphaqvgl. Gb qb guvf, vg perngrq avtu-vaihyarenoyr znpuvarf gung fjrrc guebhtu rirel 50,000 lrnef naq fcraq n srj praghevrf xvyyvat rirel fcrpvrf ng xneqnfuri 1 be terngre. Sbe gur cnfg srj plpyrf, betnavpf unir znqr cebterff gbjneq fgbccvat gur znpuvarf. Gur fhcrevagryyvtrapr nqzvgf gb lbh gung gur fbyhgvba vf ab ybatre jbexvat, naq bssref guerr pubvprf: (1) betnavpf qbzvangr znpuvarf, (2) xvyy nyy NVf, (3) "zretr" betnavpf jvgu NVf. Arvgure gur tnzr abe gur fhcrevagryyvtrapr vzcyvrf gung pvivyvmngvba jvyy abg frys-qrfgehpg, qrfgeblvat tnynpgvp srphaqvgl.
2[anonymous]9yGung'f abg nyy gub. Nyy fbyhgvbaf vaibyir gur qrfgehpgvba bs gur pvgrqry naq znff erynlf, juvpu ner gur onfvf bs tnynpgvp pvivyvmngvba. Jvgubhg gurz gur rpbabzl jvyy gbgnyyl zryg qbja, abar bs gur syrrgf jvyy or noyr gb rfpncr gur fby flfgrz, naq ovyyvbaf bs crbcyr jvyy or fghpx va cynprf jvgu ab pbzcngvoyr sbbq. Znff fgneingvba rafhrf. Naq gung'f vtabevat gung gur qrfgehpgvba bs n znff erynl perngrf na rkcybfvba ba cne jvgu n fhcreabin, jvcvat bhg gur ubfg flfgrz. Fb onfvpnyyl rirelbar qvrf, naq pvivyvmngvba arire erpbiref. Vs lbh unq yrg gur erncref fgrnzebyyre pvivyvmngvba, gur arkg plpyr jbhyq unir orra noyr gb qrsrng gurz naq ohvyq n creznanag pvivyvmngvba orpnhfr bs jneavatf cynprq nyy bire gur cynpr ol bar bs gur punenpgref.
-1khafra9yGung'f abg nyy gub. Nyy fbyhgvbaf vaibyir gur qrfgehpgvba bs gur pvgrqry naq znff erynlf, juvpu ner gur onfvf bs tnynpgvp pvivyvmngvba. Jvgubhg gurz gur rpbabzl jvyy gbgnyyl zryg qbja, abar bs gur syrrgf jvyy or noyr gb rfpncr gur fby flfgrz, naq ovyyvbaf bs crbcyr jvyy or fghpx va cynprf jvgu ab pbzcngvoyr sbbq. Znff fgneingvba rafhrf. Naq gung'f vtabevat gung gur qrfgehpgvba bs n znff erynl perngrf na rkcybfvba ba cne jvgu n fhcreabin, jvcvat bhg gur ubfg flfgrz. Fb onfvpnyyl rirelbar qvrf, naq pvivyvmngvba arire erpbiref. Vs lbh unq yrg gur erncref fgrnzebyyre pvivyvmngvba, gur arkg plpyr jbhyq unir orra noyr gb qrsrng gurz naq ohvyq n creznanag pvivyvmngvba orpnhfr bs jneavatf cynprq nyy bire gur cynpr ol bar bs gur punenpgref. Tbbq cbvag. Fvapr gur pvarzngvp raqvat fubjrq crbcyr gung unqa'g tbggra fhcreabin'rq, V nffhzrq gurl fbzrubj qvq n tenprshy fuhgqbja ba gur znff erynlf, hayvxr gur Ongnevna fbyhgvba. Ohg rira tvira gung, gurer'f qrsvavgryl n znffvir syrrg fghpx va gur Fby flfgrz naq ab zber vagrefgryyne genqr. Bu, jryy. Ng yrnfg gur zhygvcynlre'f rguvpnyyl qrsrafvoyr (V tb ol rg wnlarf gurer).
0RobertLumley9yI'm confused as to why this was downvoted - was it because it was an inaccurate summary?
0loserthree9yPerhaps because the quote was misformatted or because the poster advertised their multi-player handle.
0pedanterrific9yI don't know either, but it isn't inaccurate.
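For anyone wanting to decode the rot13'd spoilers in this subthread locally: rot13 is a self-inverse substitution cipher that shifts each letter 13 places around the alphabet, and Python's standard library happens to ship it as a codec. A minimal sketch (the sample string is illustrative):

```python
import codecs

def rot13(text: str) -> str:
    """Shift each ASCII letter 13 places, wrapping around; other characters pass through."""
    return codecs.encode(text, "rot_13")

# Because 13 + 13 = 26, rot13 is its own inverse: encoding twice round-trips.
scrambled = rot13("Mass Effect spoilers ahead")
print(scrambled)         # Znff Rssrpg fcbvyref nurnq
print(rot13(scrambled))  # Mass Effect spoilers ahead
```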
2Logos019yThere is Friendliness and there is Friendliness. Note: Ambivalence or even bemused antagonism would qualify as Friendliness so long as humans were still able to determine their own personal courses of development and progress. An AGI that had as its sole ambition the prevention of other AGIs and unFriendly scenarios would allow a lot of what passes for bad science fiction in most space operas, actually. AI cores on ships that can understand human language but don't qualify as fully sentient (because the real AGI is gutting their intellects); androids that are fully humanoid and perhaps even sentient but haven't any clue why that is so (because you could rebuild human-like cognitive faculties by reverse-engineering a black box, but if you actually knew what was going on in the parts you would have that information purged...) -- so on and so on. And yet this would qualify as Friendly; human society and ingenuity would continue.
[-][anonymous]9y 12

Fatigue. Large amounts of depressing fatigue.

It's particularly bothersome because I just recently got a very good example of how irrational it makes me. This entire post was originally written before some of the coffee I had kicked in. I was typing up my post, and I read it, and I thought there was a good chance people were going to worry about me being suicidal. And then the caffeine kicked in, and I felt more awake, and I thought "Well, that's not very descriptive. I'm depressed, but I'm not THAT depressed." and then I rewrote everything. And then I realized what I was doing, and then I had to rewrite everything to acknowledge both states.

Basically, the knowledge that "I'm entirely irrational while I'm worn out" and "I'm worn out most of the time.", put together, hurts me quite a bit in my Less Wrong parts. Of course, it might just be the availability heuristic. I might actually be less worn out than I remember. But then that brings up "A substantial majority of my recent memories seem to be of me being worn out/irrational." as its own separate problem.

Using your tool analogy, it would best be described as "My tools are dull. Sometimes... (read more)

I know how you feel. I get so much stupider and sadder when I'm tired. Have you found any solutions? I've tried naps and mid-afternoon exercise and dietary changes. The only thing that's ever helped in the long term was giving up coffee 3 years ago - the crashes after the caffeine high were making everything so much worse. It took a lot of nail-biting but it was worth it.

On the plus side, at least you recognise it's happening to you, so you can try and make sure you don't make important decisions in this state.

2[anonymous]9yTaking time off of work. Thankfully, paid time off is available to me. Unfortunately, I recognize it's happening only some of the time. But important decisions seem to happen to me so frequently I sometimes wonder if my importance sensor is calibrated incorrectly. I think I need to learn to pace myself better. I remember feeling in a very similar mood to this a while ago, reading something pointing out that people in this set of circumstances needed to learn to pace themselves better, and thinking something along the lines of "Of course! That's exactly what I need to do!" and feeling inspired. And now, here I am, weeks later, complaining of a surprisingly similar problem... I think this is evidence I didn't actually learn the lesson of how to pace myself properly. Edit: Based on this link, I apparently had this realization about pacing not even two months ago. http://lesswrong.com/lw/aks/emotional_regulation_part_i_a_problem_summary/5z6l?context=1#5z6l So I've updated my earlier comment to "weeks" later, and not months.
2Hermione9yGood point. I have a tendency to treat the marathon like a sprint. Any plans for how to improve your pacing? You've inspired me to come up with a mental list of "warning signs" that I should use as an indication I need to drop my hours for a while. (I'm thinking: skipping meals, drops in concentration and finding it harder to keep my temper).
0[anonymous]9yMy current plan is to make an effort to relax, specifically by beating a type of vague fear about "But what if I'm slacking off on X and no one is telling me! I'll (get fired/divorced/socially crushed) I have to work harder on the off chance that happens!" that I have periodically, particularly because I have NO evidence of this fear, which hopefully will make it easier to beat. All evidence points towards other people letting me know when I get anywhere near a lack of acceptable effort. If people think I'm slacking off, they'll let me know, like they have in the past. I have no need to work above the pace I can keep at all times on the off chance someone might be quietly fuming about me being lazy and going from everything-looks-fine to unrecoverable/horrible in seconds.
5moridinamael9yRelated to this, what kicks me in the Less Wrong Parts is that I can be in a bad mood and thinking irrationally, be aware that I am in a bad mood and thinking irrationally, and helplessly watch myself continue to think irrationally. Moods, for me, are very sticky, and any strategy I develop for extricating myself from a foul mood ends up only working within the context for which it was designed. I feel like if I got a handle on my moods, my demonstrated rationality would skyrocket. It might help to mention that I am not depressed or even unusually moody. In fact, I'm more even-keeled than average. Maybe this is what makes it feel that much worse when I do find myself in a foul mood. It is an unaccustomed state I don't know how to deal with.
1Strange79yHave you tried designing strategies specifically so that they wouldn't work in the context where you're designing them, and then running tests on those? Say, leave a post-it note somewhere visible saying "you are in a bad mood, and will respond to this observation with irrational anger," then updating the last bit recursively until it's accurate enough that the tired, stupid version of you is forced to agree, or is at least thrown off-balance enough to break the behavioral pattern.

I enjoy meditation, especially group meditation. It calms me down and helps me stay a bit more focused. I just want to do without the new age hippy bullshit. My eyes start to glaze over when people start to talk about God, chakras, and auras.

I've noticed many people who practise meditation have a strong belief in meditation and the more 'rational' core of Buddhist practices, but only belief in belief about the new age-y aspects. My meditation teacher, for example, consistently prefaces the new age stuff with "in Buddhist teachings" or "Buddhists believe" ("Buddhists believe we will be reincarnated") while making other claims as simple statements of fact ("mindfulness meditation is a useful relaxation technique").

9loserthree9yI used to believe in God and pray often. When I discovered that I had stopped believing, I stopped praying. In a month or two, I felt the lack strongly enough that I needed to do something about it. So I began to go through the motions and find what parts work. It turns out that I can perform prayer-like-actions without invoking a higher power and get similar results. A couple years have allowed me to refine the process, somewhat, and now I can feel the rightness, acceptance, and love-like-experience I once called 'being filled with the Holy Spirit' much more easily than I could when I had to first abase myself before God's judgement. And I call that progress.

I wrote a short seven-point response in the comments lucidly explaining its most obvious problems, and signed it Summer Glau. It got removed, and I learned a valuable lesson about productively channeling my anger.

I'd look at what you would have spent your time doing otherwise before you necessarily say (you didn't, but implied) that was a waste of time. Even if exactly 1 person read it (the one who deleted it), you got practice expressing yourself and (hopefully) coherently forming an argument.

Horrible lurching realisation: over the past couple of years, I've avoided unproductive online arguments. This has made me happier. I've also felt, over the past 18 months or so, that my general argument-forming faculties have gotten a little bit shoddy.

This is worrying.

that my general argument-forming faculties have gotten a little bit shoddy.

So long as your truth-discovering faculties are getting enough exercise, that seems like a fair trade.

5Fhyve9ySince reading Lesswrong, I try to argue more. But not to win; rather, from the point of view that I am trying to understand how the other person thinks and to modify how they think. Lesswrong has allowed me to concede points that I don't agree with, because I know that I can't change their mind yet. It's fun.
0RobertLumley9yMy experience tends to be in line with yours.

The other day someone in a class mentioned that intelligence is in the soul, and that humans are rational beings because of this. I politely interjected, explaining cognitive biases and pointing out that humans are not inherently rational, and often fail to analyse situations.

Examples of brain damage patients would also prove your point nicely, perhaps more saliently.

7Michelle_Z9yThank you, I will keep that in mind.
4RobertLumley9yHemineglect [http://www.scholarpedia.org/article/Hemineglect] is my personal favorite (boy it feels wrong saying favorite) example of this. Those with hemineglect are largely or completely unaware of the side of the universe contralesional to their brain damage. They can pick up a chair on one side of their body, put it on the other, and when you ask them where the chair is, they say "What chair?" To make matters worse, they also often insist that they have no deficit in function. It's astounding.
2NancyLebovitz9yHemineglect might fit with the "are there (true) thoughts we cannot think?" discussion.
2arundelo9ySomeone I used to know said that the brain-based nature of the mind was brought home to him when he had a stroke and his personality changed. (The only specific example I can remember is that he went from loving science fiction to having no interest in it.)

My choice example is dilettantes who learned from other dilettantes pontificating with supreme confidence about subject matter they know little about (Hello, MWI!).

Oh, I got another one, mostly confined to this forum: people making up numbers for probabilities of certain events and feeling so much more Bayesian.

I've occasionally been guilty of that, but I see it less as a magical talisman of +1 to prediction skills and more as a means of improving my calibration after the fact: if I discover I've been systematically overweighting (or, much less likely, underweighting) the probability of some category of events, that's a good clue that my thinking about them is flawed in some way. Can't do that if I don't quantify my uncertainty in the first place, although of course the flaw might come in at the quantification step.

This only works if you actually go back and collect that data, though, and I haven't generally been very good about using PredictionBook or any similar tools.
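For what it's worth, the calibration check described above is easy to automate even without a tool like PredictionBook. Here's a minimal sketch (the bucketing scheme and all the example data are mine, purely for illustration): record each prediction as a stated probability plus whether it came true, group predictions into 10%-wide confidence buckets, and compare each bucket's average stated probability to the fraction that actually happened.

```python
from collections import defaultdict

def calibration_report(predictions):
    """predictions: list of (stated_probability, came_true) pairs.

    Groups predictions into ten confidence buckets and compares the
    average stated probability in each bucket to the observed frequency.
    For a well-calibrated forecaster the two numbers roughly match.
    """
    buckets = defaultdict(list)
    for p, outcome in predictions:
        # p in [0.9, 1.0] all lands in the top bucket
        buckets[min(int(p * 10), 9)].append((p, outcome))

    report = {}
    for b in sorted(buckets):
        entries = buckets[b]
        avg_stated = sum(p for p, _ in entries) / len(entries)
        observed = sum(1 for _, o in entries if o) / len(entries)
        report[b] = (avg_stated, observed, len(entries))
    return report

# Made-up example history: (stated probability, did it happen?)
history = [(0.9, True), (0.9, True), (0.9, False),
           (0.6, True), (0.6, False), (0.7, True),
           (0.3, False), (0.2, False), (0.3, True)]

for bucket, (stated, observed, n) in calibration_report(history).items():
    print(f"{bucket*10}-{bucket*10+9}%: stated {stated:.2f}, "
          f"observed {observed:.2f} (n={n})")
```

If, say, the 80–89% bucket keeps coming out at an observed 55%, that's the systematic overweighting mentioned above made visible — but as noted, it only works if you actually record the outcomes.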

2Jayson_Virissimo9yHow are we supposed to get better at quantifying our degree of belief without practice [http://predictionbook.com/]?
2David_Gerard9yYou're not, which is why not keeping track of the results is a way of doing it wrong. (Not that I do it, but then I don't assign spurious numbers to my feelings either. Possibly I should, but if I do then I need to keep track.)
6TheOtherDave9yI mostly approach this as a set of jargon words that express finer gradations of confidence than the conventional language. That is, in normal speech I use tags like "I suspect that X," "I expect that X," "I'm fairly confident that X," "I doubt X", etc. On LW I use "I'm N confident that X" instead, where N is typically expressed to one significant figure (except I use ".99+" to denote virtual certainty). I endorse that, although I also endorse remembering that what I'm talking about is my intuitions, not reality. That is, when I say I'm .7 confident that it's going to rain this afternoon, I have said something about my mind, not about rain. I do find that the exercise of thinking more precisely about what my intuition actually is is helpful in encouraging me to pay more attention. That is, trying to decide whether I'm .6 or .8 confident in X (or whether all I can really say is that I'm .6-.8 confident) is a meaningful exercise in clarifying my own thoughts about X that I'm not as encouraged to do if my lexical habit is to say "probably X."
0Dustin9yI do this in real life quite often. But I always try to explain that I'm talking about my state of mind. I occasionally get good reactions to this along the lines of whomever I'm talking to not having ever thought about the distinction between rain and what your mind thinks about rain.
1[anonymous]9yI express completely or nearly-completely subjective degrees of belief as numbers (and hear people doing the same) so often that I sometimes forget that not everybody does that, and take a while to realize why people hearing us talk look startled. (I guess that's what happens when most of the people I hang around with are physicists, and many of those who aren't are into sports bets and/or poker.) I don't think I “feel so much more Bayesian” for that (except to the extent that I know frequentists would feel an even more painful kick), though; I mostly take that to be a figure of speech, as TheOtherDave says [http://lesswrong.com/r/discussion/lw/c08/a_kick_in_the_rationals_what_hurts_you_in_your/6grj] .
9TheOtherDave9yPossibly related... I developed the mannerism with my husband of answering rhetorical questions with oddly precise arbitrary numbers a while ago, mostly as an expression of dislike of rhetorical questions. (As in "How weird is that?" "Twelve.") It amuses him, lets me blow off steam, and really bewilders listeners.
9Dustin9yI've done this for years, and "Twelve" is my go-to number. So much so, that my wife often preempts my retort with "And don't you say 'twelve'!". Sometimes I throw a "Seven" in there, just because I'm a wild and crazy guy.
4TheOtherDave9yI'm amused; my husband and I have precisely that dynamic. And I use seventeen as my alternate meaningless number, in almost exactly the same way.
0loserthree9yIs that because of the humbug?
2wedrifid9yThat's a really good idea. I'm going to make it my new policy!
0othercriteria9yThis only really irritates me when the person stating the estimate feels obligated to put a "p" somewhere, e.g., "I believe blah with p = 0.04" or "Blah has probability (0.05 < p < 0.1)". This just signals a confusion of frequentist p-values and subjective-ish Bayesian prior probabilities, and indicates to me no real understanding of either of them.
0loserthree9ySome of us do so because we have been or expect to be asked to do so, or both. Some others maybe just want to fit in, as they entirely knowingly affect. Products of the environment, you know.
0[anonymous]9yThat really gets me.
2shrink9yOne basic thing about MWI is that it is a matter of physical fact that large objects tend to violate 'laws of quantum mechanics' as we know them (the violation is known as gravity), and actual physicists do know that we simply do not know what the quantum mechanics works out to at large scale. To actually have a case for MWI one would need to develop a good quantum gravity theory where many worlds would naturally arise, but that is very difficult (and many worlds may well not naturally arise).
2shminux9yI cannot agree with this assertion. Except for the mysterious "measurement" thing, where only a single outcome is seen where many were possible (I'm intentionally using the word "seen" to describe our perception, as opposed to "occurs", which may irk the MWI crowd), the quantum world gracefully turns classical as the objects get larger (the energy levels bunch tighter together, the tunneling probabilities vanish exponentially, the interaction with the environment, resulting in decoherence, gets stronger, etc.). This has not been shown to have anything to do with gravity, though Roger Penrose thinks that gravity may limit the mass of quantum objects, and I am aware of some research trying to test this assertion. For all I know, someone might be writing a numerical code to trace through decoherence all the way to the microscopic level as we speak, based on the standard QM/QFT laws.
-1shrink9yRead up on quantum gravity (or rather, the lack of a unified theory combining QM and GR). It is a very complex issue and many basics have to be learnt before it can be discussed at all. The way we do physics right now is by applying inconsistent rules. We can't get QM to work out to GR at large scale. It may gracefully turn 'classical', but this is precisely the problem, because the world is not classical at large scale (GR).
3shminux9yI am well aware of the QG issues. That was not my point. I will disengage now.
2David_Gerard9yTo be fair, a lot of actual physicists feel that way when they hear the word "quantum". It's definitely one of the common cases of things that make people feel this way.
1lsparrish9yArmchair critics are often just as bad. For example, there are all those people who insist [http://www.quora.com/How-would-you-answer-the-following-life-extension-situations-1-Youre-90-years-old-aware-that-youre-going-to-die-soon-You%E2%80%99ve-had-a-rewarding-social-and-intellectual-life-Even-better-you%E2%80%99re-a-billionaire-with-%E2%80%9Cunlimited%E2%80%9D-budget-A-cryonics-sales-guy-comes-to-your-door-Due-to-technical-reasons-he/answer/Eric-Griffiths] that "cryogenics" doesn't work because ice crystals inevitably explode all the cells. The much harder thing to accept, however, is when experts and public faces of the field fail [http://blog.ciphergoth.org/blog/2010/02/07/survey-anti-cryonics-writing/] to correct public opinion on the matter.

LW has made me feel this way about general run-of-the-mill Internet stupidity.

Not to mention the pain of the older teen coming out with mindboggling failures of logic when she was studying A-level Philosophy. And not doing too badly on it. (Some LW readers on my Facebook will recall some stunningly scrambled thinking about religion.) Kids, eh.

Kids, eh

Hey, you're the one who sired a human. Don't go blaming her for the resulting failure to think clearly.

She's the loved one's daughter, not mine. My genes are innocent of that one!

Now, the 4yo who is really obviously mine ... she's already got the clever rhetorician skills down pat.

0arundelo9yThis reminded me of Ayn Rand on Immanuel Kant [http://aynrandlexicon.com/lexicon/kant,_immanuel.html]: The article you linked [http://web.maths.unsw.edu.au/~jim/worst.html] does mention Kant, though apparently Stove was easier on him than Rand was:

A friend of mine goes to The North American Institute of Medical Herbalism. Today, she and her classmates tried five different "flower essences" (made in basically the same way as homeopathic medicine) and talked about their reactions in what was described as a double-blind trial. Naturally, they all experienced very similar and significant effects from each essence. It's too bad they can't get anyone to thoroughly document these double-blind trials they keep running on energy medicine!

Various cases of NPD online. The NPD-afflicted individuals usually are too arrogant to study or do anything difficult where they can measurably fail, and instead opt to blog on the topics where they don't know the fundamentals, promoting misinformed opinions. Some even live on donations for performing work that they never tried to study for doing. It's unclear what attracts normal people to such individuals, but I guess if you don't think yourself a supergenius you can still think yourself clever for following a genius whom you can detect without relying o... (read more)

You know, an uncharitable reading of this would almost sort-of kinda maybe construe it as a rebuke of the LW community. Almost.

3shrink9yIt's more a question of how charitably you read LW, maybe? The phenomenon I am speaking of is quite generic. About 1% of people are clinical narcissists (probably more), that's a lot of people, and the narcissists dedicate more resources to self promotion, and take on projects that no well calibrated person of same expertise would attempt, such as e.g. making a free energy generator without having studied physics or invented anything not so grandiose first.
1David_Gerard9yUnfortunately, it's actually a generic phenomenon, as I note [http://lesswrong.com/lw/c08/a_kick_in_the_rationals_what_hurts_you_in_your/6gpi] (and that it is is important).
0TheOtherDave9yWhich of course doesn't preclude (the possibility of) shrink's comment being intended as a rebuke of the LW community.
0David_Gerard9yWell, yeah. But it really isn't just that, but a pattern.
0shminux9yOoh, burn! Your last link explains the ire I expressed in my other comment [http://lesswrong.com/lw/c08/a_kick_in_the_rationals_what_hurts_you_in_your/6gmh] , thank you.
0David_Gerard9ySounds like the LaRouche cult. edit: that last link is excellent. The ingroup-outgroup thing gone pathological. All the ingroup needs is a defined enemy and WHAM! cult.
-4metatroll9yOMG, is that the real "shrink"? If so - we're not worthy! I own a copy of every one of your books.

The thing that gets me is in movies or books or whatever, when they do the thing that makes no sense but holds the plot together. My brain screams. Instead of "hold on, that's too stupid to exist", my brain says, "Oh, this is a movie about stupid people doing stupid things." For example, in my head the movie Avatar was about some incompetent division of a mining corp using mickeymoused old bootleg clones to try to mine a terrible little planet off in some desolate corner of nowhere before they get shut down on safety violations.

5David_Gerard9yThe idiot plot [https://en.wikipedia.org/wiki/Idiot_plot], in which every character needs to be an idiot for the plot to work. See also the "second-order idiot plot", in which not merely the characters but everyone in the society needs to be an idiot for the plot to work.
5JoshuaZ9yThank you for linking to Wikipedia and not TVTropes.
3Kaj_Sotala9yOut of curiosity: are you actually grateful, or just saying that as a joke? (I'm asking this because so far I'd presumed such comments were jokes and while TVTropes was addictive, surely it wasn't that bad and you could always close it if you really had to - but I've seen enough of them to start to suspect that they might not all be jokes, and that for some people TVTropes really is that bad.)
2JoshuaZ9ySomewhat grateful. It isn't absolutely awful since I can often avoid clicking on the initial TVTropes page, but if I do click on a single TVTropes page I'm likely to get stuck there for a while. In this case, I clicked on the link once without thinking about it and then was happy to see a Wikipedia page.
5Oscar_Cunningham9yThis is essentially the premise of "Burn after Reading".
3[anonymous]9yYes, but it has a clever subtext, which makes all the difference. Cf. Jack and Jill [http://www.imdb.com/title/tt0810913/]. (I have often have a reaction similar to cloudlicker's when confronted with bad storytelling in books/movies.)
2David_Gerard9yThe TVTropes page (which I am not linking to) notes Idiocracy as an example of a plot where everyone in the society is an idiot and the premise works. If that's the actual point of the story, that's different to doing it just to give yourself a plot at all.

Intelligent Design makes me crazy. It's not that it's an argument against evolution, it's that my mind is screaming, "HOW CAN PEOPLE NOT NOTICE THAT WE DON'T KNOW ENOUGH ABOUT UNIVERSES TO TELL WHAT'S PLAUSIBLE?"

0Oscar_Cunningham9yI don't understand. I think you might have said "DON'T" when you didn't mean to.
2NancyLebovitz9yNo, I meant that if we had a bunch of universes, we'd know something about how much complexity is typical. On the other hand, this might not be enough information to judge whether all the universes had been created. However, what I've seen of Intelligent Design is the idea that it's implausible for life to have evolved on its own, and I don't think we have enough information to judge plausibility. I'm inclined to think that people who find evolution implausible don't have a feeling for the huge number of molecular interactions there have been in life's history.

The correct response to that sort of stupidity is immediate tab-closing, unless it comes from an especially important person who can be swayed. Unfortunately, this still really gets me in person, where walking away is harder.

I have a lit-crit friend who I have known for the better part of a decade. We have an ongoing struggle to understand each other, and as part of this we will occasionally trade ideas the other finds incomprehensible. As part of this cultural exchange process, she decided to send me something about one of my subjects (econ) in 'her language', and linked me to this.

Needless to say, this was like a cannonball to my LessWrong Parts.

As much as I do find this sort of stuff distressing, I also find it useful for helping me explain precisely why I'm so confident in dismissing it as informationally bankrupt. The general retort from the literary type is that these sorts of texts contain lots of specialist language and ideas, and just as you wouldn't expect a lay-person to understand a maths or physics paper off the bat, you shouldn't expect to understand something like the above.

To which I respond "my arse". Papers in disciplines I consider to be respectable, but lack any deeper knowledge of, have a recognisable argument structure, even if I don't necessarily understand the arguments. Also, any epistemology worth having should demand claims be provided with means of substantiatin... (read more)

The linked essay makes perfect sense to me; and I'm certain the only reason it doesn't to you is indeed just the jargon. I don't think it's a particularly good analysis, ultimately, but for boring reasons that would make any essay weak, not because it's saying nothing. It's also not attempting to be a knock-down argument, not on account of its theoretical stance, but because it's a short blog post firing off some impressions in a perhaps unjustifiably confident tone, which of course Manly Man Rational Economists(tm) do all the time.

That said, my acquired intuition is that within [the cluster of people/ideaspace that the typical LessWrong reader would ugh-field as "pomo"], as within many other clusters, the lack of clarity in language does certainly covary with lack of clarity in actual thought. But I can't really say how much my own tribal academic loyalties (or desire to believe that I can understand anything that means anything) have helped produce that sensation.

I can read it, and I'm pretty sure I can see what it's trying to say, but I can't find a cogent structured argument in it. More pressingly, after I tease apart its wilfully impenetrable written style, so what?

It has a reasonably coherent central idea (albeit one that could have been conveyed in a fifth of the wordcount), but it doesn't present a case for it. It just makes claims, occasionally referencing other people's claims. I can make claims too. Brigitte Bardot has a birthmark on her ankle in the shape of a duck. Does she? I dunno; maybe. It's not now your job to go away and research whether Brigitte Bardot has any birthmarks in the shape of waterfowl. It's mine. It's always mine.

It makes no effort to convince me it's not just some random stuff some dude is saying (cf. timecube.com). Lots of dudes say lots of random stuff. Why should I care about this? Why should I put it in my head and allow it to influence my expectations of what's going to happen? What's the difference between a world where this is accurately describing something and a world where it isn't?

If this structure, this mechanism for saying "here's something, and here's why" actually exists in there, please tell me where it is.

A world where this accurately describes reality, rather than one where it doesn't, is one where 1) most people consent, more or less, to the idea that they should pay their debts, 2) indebtedness is stigmatized, 3) debt is seen primarily as a relationship between individuals, 4) the indebted are less likely to be politically active than they would if they were not indebted, ..., &c. It's intended to resonate with one's phenomenal experience and background assumptions, and no, it doesn't attempt much more than that, so like I said, it's a bad argument.

This may sound like a glib remark, and it is, but it's also a legitimate query: where are they hiding all the good arguments?

My lit-crit friend, a Ph.D. student herself, presumably provided this example in the misguided hope that it would offer an insight into the value of her way of thinking. Was it just a bad choice on her part? Is there some secret trove of critical theory observations on debt that I might look at and think "woah! This is knowledge worth having"?

It's a reasonable question. First, I think that the linked example is not the best of post-modern thought.

More importantly, a lot of post-modern thought is co-opted and the label is forcibly excised. Here are some examples of what I think are good post-modern ideas.

  • There was a tendency for colonist-era Europeans to ascribe exotic virtues to Near and Far Easterners that had little relationship to the values of those communities. Orientalism is a discussion of this dynamic related to Near Eastern culture. I don't think the dynamic can be well explained by reference to in-group/out-group, but post-modernism does a good job, in my view. Consider also the phenomenon of the Magical Negro (warning: TVtropes).

  • Death of the Author (TVtropes), the view that the author's opinions do not control a work's interpretations, is also heavily influenced by post-modern thought (or so I understand - I'm not very interested in most lit crit of any flavor)

  • The slogan "The personal is political" is insightful because it highlights that "political" (i.e. partisan electioneering) is not really a natural kind in political-theory conceptspace. Issues of personal identity are just

... (read more)
0sixes_and_sevens9yThank you for this. It's funny you should mention Death of the Author. I have another friend whose academic background is in literature, and he rants to the point of blind fury about how ridiculous a notion it is. I showed him the above link to get his opinion, and his most pointed comment was how the author's emphasis on academia, student debt and being forced to work menial academic positions was not a shining indictment of Roland Barthes.
4David_Gerard9yBarthes is good, comprehensible and generally on the ball. He's actually not a waste of your life to read. Start with Mythologies like everyone does. (No-one who lives on the internet would find it radical these days, but it certainly was when it was published.)
1TimS9yHe disagrees with "Death of the Author"? You've whetted my curiosity - I've always thought that it was a fairly reasonable position. Also, I don't know what you mean by "indictment of Roland Barthes"

Personally, I've long been of the opinion that Death of the Author is, if not exactly wrong, still an idea which has been more harmful than useful with respect to its effects on literary criticism.

The central idea of Death of the Author is to judge the text itself without limiting interpretation to that which is imposed by the author's intentions. There are certainly cases where one can glean valuable information from a text which the author did not consciously choose to add to their work. For instance, the author might have views on race which will leak out into their writing, in such a way that a perceptive reader will gain insight about their views even though the author did not intend to make any sort of statement about race whatsoever. However, I think that to divorce the text entirely from the context of its creation is an invitation to abuse the basic principles of communication.

As Roland Barthes put it, "To give a text an Author" and assign a single, corresponding interpretation to it "is to impose a limit on that text." But imposing limits on a text is necessary in order to extract any information from it at all. It's only by narrowing down our space o... (read more)

3NancyLebovitz9yThis is at the notion level, but I'm wondering where the meaning of a text is. Candidate theories: In the mind of the author. Readers are constructing (whether consciously or not) models of what the author had in mind. In the minds of readers. People have access to the mind of at most one reader. But that's limited fun for some kinds of analysis, so critics are apt to make guesses about the minds of a great many readers. I have respect for a professor (sorry, name and university forgotten) who asked people what they loved about the Lord of the Rings, and why they read it repeatedly. The consensus answer was the moment when Sauron's power fell. Until I wrote this, I didn't realize that this is an example of successful authorial intention-- Tolkien thought eucatastrophe was crucial.
2TimS9yFor fictional texts, I'm not sure that extracting information from the text is really the best way of thinking about the text-reader interaction. The "Wizard of Oz" movie is allegedly [http://en.wikipedia.org/wiki/The_Wizard_of_Oz_%281939_film%29#Alleged_impact_upon_the_LGBT_culture] very influential in gay culture in America. Assuming this is true, I find it implausible that this was the intent of a movie made in 1939. Does that show that the movie can't "mean" something about gay culture?
5Desrtopa9yCould you taboo "mean?"
1TheOtherDave9yWhich, I suppose, raises the question of whether there's any value to be gotten from reading fiction, and if so what it is that one is getting of value. Which might in turn raise the question of whether it's possible to get that thing-of-value from nonfiction as well, in which case perhaps extracting information from the text is perhaps not the only way to engage with nonfiction, either.
2David_Gerard9yIt was useful at the time. Remember that postmodernisms are reactions against modernisms, and reactions date badly. There are few good ideas that can't be made into bad ideas by overdoing them hard enough.
7AlanCrowe9yUpton Sinclair's book The Jungle provides a concrete example, useful for grounding a discussion of the "Death of the Author". Stealing a paragraph [http://en.wikipedia.org/wiki/Upton_Sinclair#The_Jungle] from Wikipedia: Authors really do intend specific interpretations, and can notice, with disappointment, when readers impose a different interpretation by weight of numbers.
1Strange79yI have a hard time sympathizing with Upton Sinclair's complaint about the specifics of how his resounding success was implemented. He thought unregulated capitalism was bad, and explained why; people agreed, and tore down the "unregulated" part.
0David_Gerard9yYep. Any writer of fiction needs to understand the concept of "death of the author", even if they don't call it that - the text as all the reader has to go on.
7sixes_and_sevens9yThat's because it's late and I used a word with exactly the opposite meaning I was reaching for. The sentiment I was trying to get at was that the author's emphasis on debt as it impacts a doctoral candidate (as opposed to, say, a blue collar worker who has to moonlight in a petrol station to cover their loan repayments) does not bode well for the idea of divorcing an author's personal opinions and circumstances from the interpretations of their work. The way my friend has always described Death of the Author (we lived together at University, so I've been treated to this for a while) suggests he has been subject to a much, much stronger form than the one you describe, in which an author's motives, intentions and circumstances should be completely disregarded when interpreting their work. (I recall he once wrote a poem which, in a pleasantly Gödelian fashion, described his own motives and intentions while writing it. He hoped one day to become famous enough as a poet that English undergrads would be forced to try and analyse it without reference to his motives and intentions for the piece. There's a reason we've remained friends for so long.) Coming at it from an information theoretic perspective, Moby Dick is clearly not talking about the Soviet Union's occupation of post-war East Germany. Part of my certainty in that statement involves facts about the knowledge and history of the author (most saliently that he wrote the book, and died, in the 19th Century). Any implementation of Death of the Author strong enough to say "that doesn't matter: Ahab is totally Stalin" is not an implementation I can really get behind.
0[anonymous]9yI like this account of intentionalism [http://ethicalwerewolf.blogspot.com/2007/08/defending-intentionalism.html].
3Oligopsony9yWell, I think you need to more rigidly designate "they," but since neither debt nor literature-department-influenced frameworks are my actual bailiwick I can't in confidence help you out here. I do find that some of the Big Inscrutable Names, like Foucault and Althusser, can be useful to think with. (If it sounds as though I'm being a really poor defender of Critical Whatever it's because I'm not, really; I'm more used to being typecast as the guy who thinks the cultural turn was bullshit. But it's not as bullshit as most intelligent outsiders assume, is my incredibly modest claim.)

If the linked essay make perfect sense to you, perhaps you can explain this sentence

In capitalism, all debts finally break free from the sovereign and become infinite by conjoining flows.


If we took fifty literature postgrads from across the English speaking world, and asked them to explain the sentence, would they give consistent answers?

8chaosmosis9yIf they were familiar with the way Deleuzians phrase things then about 80% would, is my guess. Mostly the quality of postgrads is pretty poor because lots of philosophy professors suck, which influences this. I got the same interpretation as Tim S though. I've read some D(&G) stuff before. "Infinite" is just Deleuzians being overdramatic and imprecise with language. Or, perhaps they're not trying to convey the logic of the argument so much as the idea or feel of the argument. Deleuzians often have a hard time seeing the division between things like logic and persuasion and bias. They're right insofar as there is no hard concrete division between those things, but it sometimes makes them lazy. RE: Below comments: "flows" mean something specific within Deleuzian terminology. It implies interconnectedness and chains of causality with uncountable numbers of variables interacting with whatever it is that they're talking about. It also has implications related to perceiving objects as dynamic rather than as static. Once you understand the jargon and have read his arguments a bit it's actually sort of pleasant to read Deleuze's stuff. His frequent use of metaphors allows him to make subtle references to other comments and arguments that he's made in the past. It's like how jargon is useful, except the benefit is not precision but is rather the breadth of meaning which each phrase can convey. Also, it's almost never that the associations of arguments invalidate the misinterpretation, but that the misinterpretation overlooks specific shades of meaning. It's difficult to interpret on some rare occasions but once it's interpreted there's a lot of meaning in it. Most of the Deleuzian secondary authors suck though. They give me headaches.
4TimS9yEven as a post-modernist, I wouldn't say I'm impressed with the average post-modern thinker. In other words, I don't know the answer to your question, and am not confident that it would reflect well on post-modern thought. I will say that post-modern art theory (as opposed to political theory) is least impressive to me. It always seemed to me like art critics have already said all the interesting things that aren't post-modern, so post-modern literary criticism is the only way to say something new. And if it isn't new, it doesn't get published. But this is an uninformed outsiders impression.
1David_Gerard9yIn my rock critic days I found it a useful tool in writing about and understanding pop culture. ('80s British pop music is what you'd get if you tried monetising postmodernism, and I don't just mean ZTT.) It's the sort of thing you really want to have a use for before you bother with it more than casually. (I still think in terms of critical understanding of stuff all the time and read books of criticism for enjoyment, even of artistic fields I know nothing about. I realised a while ago that if I were doing for a job the thing I would be best at, I'd be a professor of critical theory and paid a lot less than I am as a sysadmin.)
2NancyLebovitz9yShould the test be done by asking postgrads or professors? Why one or the other?
6sixes_and_sevens9yI chose postgrads because the counterpoint would be asking, say, statistics postgrads what a moderately arcane piece of stats terminology means in context. We then have the extra avenue of asking professors. The stats professors should give answers consistent with the postgrads, because stats terminology should be consistent in the public domain; the professors may know more about it, but they don't have any normative influence as to what the terminology means. Will the literature professors have answers consistent with, but more knowledgeable than, their postgrad students, or will they be something different altogether?
0siodine9yIs there a good pomo vocabulary guide somewhere? (I'm assuming 'sovereign' and 'conjoining flows' are pomo jargon)
7TimS9yI'm not aware of any special meaning for "conjoining flow." I assumed it was a metaphor and interpreted it in light of the next sentence in the essay. Post-modernism loves metaphor and hyperbole, for better or worse. I readily acknowledge that frequent use of those styles impedes readability.
6thomblake9yNot pomo jargon. It just means the supreme authority, like the King or the State. Used extensively in Political Science.
2siodine9yWhat? That's not answering my question (at least, why ignore 'conjoining flows'?). And I get what sovereign means in this context like I get what synergy means among management, but 'synergy' is still management jargon.

at least, why ignore 'conjoining flows'?

If you ask two questions in one comment, and someone knows the answer to one of the questions, what would you like that person to do?

0siodine9yMy bad, I confused TimS with thomblake (because their names are so similar). I wrongly thought TimS was only explaining what sovereign meant even though they interpreted 'conjoining flows' somehow. But even so, sovereign could still be jargon unless thom is familiar enough with pomo to say otherwise--it's not enough that it's used in other contexts as well (I thought it might be jargon because I've heard continental philosophers using it often enough before).
2TimS9yBut post-modernism is a type of political theory. Therefore, it borrows some jargon from more mainstream political theory. It's also a type of literary criticism theory. As applied to literary criticism, it doesn't impress me, but most literary criticism doesn't impress me, so that's not a very meaningful statement.
0siodine9yHas there been much cross pollination between post-modernism and competing or parallel schools of thought (in say the last couple decades)? (I'd think there would be a language and tribal barrier preventing or largely limiting that.) If not, do you think the latest and greatest of post-modern thought ought to have a significant impact in other areas?
1TimS9yIs this [http://lesswrong.com/lw/c08/a_kick_in_the_rationals_what_hurts_you_in_your/6gdu] a partial answer to your question?
2siodine9yNot really, but maybe. I think (could be a common misconception) you could have added that post-modern thought helped the sciences realize their prejudices (misogyny, ethnocentrism, and so on). And so when I take all those accomplishments together it starts to look like post-modernism acts as a meta-critic for the practices or structure of various fields. Does this sound right? If so, has it had any recent accomplishments (i.e., is it decaying)?
4David_Gerard9yIt sounds like the ideal of what it should be. I think it's got some usefulness in this direction. But even when I defend PM as not being 100% bullshit, I have to take care to note that it's 99% bullshit. A lot of it is academic performance art.
2Tyrrell_McAllister9yI think that this is a very good first-pass definition of post-modernism, or at least of its goals.
3dlthomas9yI didn't read the essay, but taking a swing at the sentence, it could be a reference to the lending and re-lending of fractional reserve banking creating a larger money supply than what was issued by the sovereign. I'm not sure where "infinite" enters into it, though... maybe it is meant to mean "unending" rather than "innumerable"?
2Oligopsony9y"Back in the day, in Hanson's farmer epoch, public morality was maintained in part by cultivating in people a sense of gratitude towards God/the universe/society/one's parents/the resident nepotist with a sword; that their existence entailed debts that in principle couldn't be repaid. Nowadays under liberalism we've in principle thrown that out but everybody's still linked in a web of very explicit debts, and the web doesn't in principle have a center." (You might say that this is a massive oversimplification to the extent that it's true, and you'd be right, of course.)
0David_Gerard9yOh, Deleuze. D+G are interesting and thought-provoking but all but impenetrable. Imagine a textbook written in the style of an experimental novel. Surprise! That's actually your textbook. As with Derrida, I suggest you start with others' synopses, not the originals. Alternately, if you don't have any actual reason to read it (such as actually having a use for something that could reasonably be termed "critical theory"), just throwing them against the wall will likely save a lot of time.
1sixes_and_sevens9yWhat uses do people normally have for something termed "critical theory"?
6beoShaffer9yRent seeking, signaling (mostly to pretty specific groups), fun.
1David_Gerard9yPretty much. I have always tried to keep in mind that the actual justification of criticism is to turn people onto good stuff and warn them off bad stuff.
0Normal_Anomaly9yIt can also be justified by helping people who liked or didn't like something understand why, so they can seek or avoid those qualities in other works.
0David_Gerard9yOh yeah, turning noise into music. I think that's covered, though.
3David_Gerard9yIn my case, being a rock critic. (The money is much better in IT.) But really that I'm interested in art criticism for its own sake and read it for entertainment even when I know little or nothing about the art in question.
[-][anonymous]9y 1


The most astounding fact about the universe is the knowledge that everything we perceive—color, sound, and even energy itself—is a process that involves our consciousness.

hurt you in your LessWrong Parts? (the article's conclusion)

9faul_sname9yOnly one word needs to be added for it to be accurate (if unexciting).
0[anonymous]9yPretty exciting the first time it occurs to you? It's an article in "PsychologyToday.com" This is admittedly beside the point of the main post.
8Alsadius9yThat barely qualifies as insight when you're high.
0[anonymous]9yYou forgot the "do not exist outside your consciousness" part. Or is that wrong? The upvotes on your post tell me I'm missing something... but to me what you quoted is different, and yes, a lot less exciting, than what I quoted?
0Alsadius9yIt was mocking, but I think basically accurate. Light has wavelengths and sound has pressure waves whether we're there to observe them or not. Even if you want to argue that it's not really "colour" or "sound", who cares? The only part of it that's actually created by our consciousness is the conscious appreciation for what already exists.
2faul_sname9yI recall the realization that consciousness was a physical process as being much more exciting than the realization that I only experience my conscious thoughts.
0[anonymous]9yNo, I was talking about a different realization.
5[anonymous]9yDepending on what 'conscious perceptions' is supposed to mean that's either completely the opposite of astounding or just false. Though on second thought I suppose the realization that perception is a physical process occurring inside your skull is exciting in a certain sense!
-2chaosmosis9yNo, it's implicit the way it is. Perception entails consciousness, unless you have a weird definition of perception.

Look up blindsight. Perception without conscious processing.

2[anonymous]9yA definition of perception according to which insects have it doesn't sound too weird to me.
2[anonymous]9yThis could be checked by giving a pinprick to someone in a coma while scanning their brain. (And I'm being generous and not assuming a definition of consciousness which excludes healthy people in deep dreamless sleep, because if perception entailed consciousness in that sense, alarm clocks would work much less reliably.)
7sixes_and_sevens9yNot as much as "physics is wrong because our brains are magical because I say so", but yes, it does. That our perceptions are a process that involves our consciousness (however you want to define that) is, technically, a fact about the universe, but only in the same way "my cat's breath smells like cat food" is a fact about the universe. The perceptions fact (insofar as it is a fact) is more of a statement about human cognition than it is about the universe. It may very well be the statement he finds most astounding about the universe, but it's not the core of the paradigm-busting central theory of everything he purports it to be.