I recently read an article on charitable giving which mentioned how people split their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity. And this seems a bit like the example you cited where, if blue cards came up randomly 75% of the time and red cards came up 25% of the time, people would bet on blue 75% of the time even though the optimal strategy is to bet blue 100% of the time. All this seems to come from concepts like "Don't put all your eggs in one basket", which is a good general rule for things like investing but can easily break down.
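(As a quick sanity check of the card example, here is a minimal simulation sketch. The 75/25 odds are the ones stated above; everything else, including the trial count, is purely illustrative.)

```python
# Compare "always bet blue" against probability matching under 75/25 odds.
import random

random.seed(0)
TRIALS = 100_000
P_BLUE = 0.75

def accuracy(strategy):
    """Fraction of correct guesses for a guessing strategy (a zero-arg callable)."""
    hits = 0
    for _ in range(TRIALS):
        card = "blue" if random.random() < P_BLUE else "red"
        if strategy() == card:
            hits += 1
    return hits / TRIALS

always_blue = lambda: "blue"
probability_matching = lambda: "blue" if random.random() < P_BLUE else "red"

print("always bet blue:     ", round(accuracy(always_blue), 3))           # ~0.75
print("probability matching:", round(accuracy(probability_matching), 3))  # ~0.625 = 0.75^2 + 0.25^2
```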
I find myself having to fight this rule for a lot of things, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket", and I need to "diversify". You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?" and then thinking "Eh, as soon as he stops recommending such good boo...
I tried to start a Hofstadter cult once. The first commandment was "Thou shalt follow the first commandment." The second commandment was "Thou shalt follow only those even-numbered commandments that do not exhort thee to follow themselves." I forget the other eight. Needless to say it didn't catch on.
You just didn't give it enough time. Remember, it always takes longer than you expect!
I find myself having to fight this rule for a lot of things, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket", and I need to "diversify"
See also Robin Hanson's post on Echo Chamber Confidence.
You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?"
I think that of all the people who have ever recommended books to me, Eliezer has the most recommendations which I've actually followed. In most of my social circles, I'm the "smart one", but I'm nowhere near as smart as Eliezer (or most other people on LessWrong, it seems). So I do admire EY a lot. I want to be as smart as he is, and so I try reading all the books he has read.
And it kills me, because I also remember his post about novice editors copying the surface behavior of master editors, without integrating the deep insight, and I know that by reading the same science fiction novels EY has read, I'm committing exactly the same sin. But I don't know what else I can do to try to improve myself.
how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity.
If I have complete or near-complete trust in the information available to me about the charity's utility, as well as its short-term sustainability, that seems like the right decision to make.
But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?
Similarly with knowledge and following reading lists, ideologies and the like.
Specifically on this topic.
The expected number of eggs lost is least if you choose the best basket and put all your eggs in it, but because of diminishing returns, you're better off sacrificing a few eggs to reduce the variance. However, your charitable donations are such a drop in the ocean that the utility curve is locally pretty much flat, so you just optimise for maximum expected gain.
It follows from the assumption that you're not Bill Gates, don't have enough money to actually shift the marginal expected utilities of the charitable investment, and that charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have marginal expected utilities in perfect balance.
And that you care only about the benefits you confer, not the log of the benefits, or your ability to visualize someone benefited by your action, etc.
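(Here is a small numerical sketch of the point this thread is making. The baskets, drop probabilities, egg counts, and square-root utility below are invented purely for illustration; they're not anyone's actual model.)

```python
# Two baskets that may each be dropped independently, and a donor/egg-owner
# whose utility is either linear or concave (diminishing returns) in eggs kept.
from itertools import product
from math import sqrt

P_DROP = {"best": 0.10, "other": 0.20}  # hypothetical probability each basket is dropped

def expected_utility(allocation, utility):
    """Exact expected utility over all drop/survive outcomes.

    allocation: eggs placed in each basket, e.g. {"best": 10, "other": 0}.
    utility: function from eggs kept to utility.
    """
    total = 0.0
    for outcome in product([True, False], repeat=len(allocation)):
        prob, kept = 1.0, 0
        for (basket, eggs), dropped in zip(allocation.items(), outcome):
            prob *= P_DROP[basket] if dropped else 1 - P_DROP[basket]
            kept += 0 if dropped else eggs
        total += prob * utility(kept)
    return total

for alloc in ({"best": 10, "other": 0}, {"best": 5, "other": 5}):
    print(alloc,
          "expected eggs:", round(expected_utility(alloc, lambda k: k), 2),
          "expected sqrt-utility:", round(expected_utility(alloc, sqrt), 3))

# Putting all 10 eggs in the best basket maximizes the expected number of eggs
# kept (9.0 vs 8.5), but the 5/5 split has the higher expected sqrt-utility
# (~2.858 vs ~2.846): with diminishing returns, it pays to trade a little
# expected value for less variance. A donor whose utility is locally linear in
# a charity's output should instead concentrate on the best charity.
```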
Consider scope insensitivity. The amount of "warm fuzzies" one gets from helping X individuals with a given problem does not scale even remotely linearly with X. The fuzzies from actions that help with distinct problems, however, sum in a much closer to linear fashion (at least up to some point).
Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".
But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?
Very much so. Rational behavior is to maximize expected utility. When rational agents are risk-averse, they are risk-averse with respect to something that suffers from diminishing returns in utility, so that the possibility of negative surprises outweighs the possibility of positive surprises. "Time spent reading material from good sources" is a plausible example of something that has diminishing returns in utility so you want to spread it among baskets. Utility itself does not suffer from diminishing returns in utility. (Support to a charity might, but only if it's large relative to the charity. Or large relative to the things the charity might be doing to solve the problem it's trying to solve, I guess.)
PG runs a discussion site. He's using it as a sort of wide-flung net to catch worthy candidates for the "inner circle" - startup founders who get into his YC program - and is quite open about it (e.g. he explicitly says that YC submissions will, among other things, be judged on how well their authors are known as HN commenters and how worthy their comments have been judged to be). Why is it surprising that this creates a cult atmosphere of sorts?
Before Hacker News, PG was already famous in the relevant community for his essays, which are often credited, among other things, for the modern revival of interest in Lisp (this is probably an exaggeration). Nobody called him a cult leader back then.
Joel Spolsky is a famous blogger in the programming/CS/IT niche; he has an active discussion forum on his site. Lots of people respect him, lots of other people look down on his posts. Nobody calls him a cult leader.
RMS doesn't even have a discussion forum, and doesn't write a blog. He browses the web through an email-mediated wget; that's not even Web 1.0, it's Web -0.5 or something. He's widely considered to be a cult leader.
I'd guess that to make people think you're behaving like a c...
Are you aware of the irony in saying Eliezer "won't shut up" about a topic he has demanded everybody shut up about?
I am. I view it as evidence that he recognizes the filtering effect these topics have brought to OB, and intends LW to build a community diverse and independent enough to not let itself be dominated by these topics, unless it so chooses. It's a smart decision.
The "top" page is already entirely dependent on post score. I'd strongly prefer that there stay some kind of editorial filter on some aspect of LW; we're doing great right now as a community, but many online communities start out high-quality and then change as their increased popularity changes the crowd and the content.
I don't know if that ever happened, and I didn't mean to imply he had been. Suppose someone tells you that you've been acting like a cult leader. Even if you don't agree with the claim, you've just obtained a convenient meta-explanation of why people disagree with you: they're consciously standing up to the cult that isn't there; they're being extra contrarian on purpose to affirm their cherished independence. What I was trying to say is that it's generally dangerous to adopt this meta-explanation; you're better off refusing to employ it altogether or at least guard its use with very stringent empirical criteria.
So... just for the record... this post got up to #1 on HN, and then HN crashed, and is, so far as I can tell, still down a couple of hours later.
When you consider that the Less Wrong site format was inspired by HN, that LW is based on Reddit source code, and that Reddit is a Y Combinator company, I guess that writing about Paul Graham and then getting voted up on Hacker News exceeded the maximum recursion depth of the blogosphere.
This would be an excellent time for a "stack overflow" joke, if only Spolsky could be worked in somehow.
And here you are commenting on HN going down, and here's the guy who submitted this to HN replying to your comment.
"I guess that writing about Paul Graham and then getting voted up on Hacker News exceeded the maximum recursion depth of the blogosphere."
Just wait until PG writes an essay about all this...
Picture of Eliezer in monk's robes (that is you, right?), stories about Freemason-esque rituals, specific vocabulary with terms like "the Bayesian conspiracy".
It's all tongue in cheek, and I enjoy it. But if you're trying to not look like a cult, then you're doing it wrong.
It seems to me that only a few groups get the label "cultish", so it's not like people put the label on any group with an apparent leader. Such selective labels probably contain a lot of info, so it seems worth figuring out just what that info is. It is not wise to just find one group that gets the label which you think is fine, and then decide to ignore the label.
The straightforward approach would be to collect a dataset of groups, described by various characteristics, including how often folks call them "cultish." Then one would be in a position to figure out what this label actually indicates about a group.
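(A minimal sketch of what that "straightforward approach" might look like. Every group, characteristic, and rate below is an invented placeholder; the point is only the shape of the computation, not any real finding.)

```python
# Hypothetical dataset: for each group, some binary characteristics and an
# imagined rate (0..1) at which people call the group "cultish".
groups = {
    "group A": {"single dominant figure": 1, "in-group jargon": 1, "tolerates dissent": 0, "cultish_rate": 0.8},
    "group B": {"single dominant figure": 1, "in-group jargon": 0, "tolerates dissent": 1, "cultish_rate": 0.3},
    "group C": {"single dominant figure": 0, "in-group jargon": 1, "tolerates dissent": 1, "cultish_rate": 0.2},
    "group D": {"single dominant figure": 0, "in-group jargon": 0, "tolerates dissent": 1, "cultish_rate": 0.1},
    "group E": {"single dominant figure": 1, "in-group jargon": 1, "tolerates dissent": 1, "cultish_rate": 0.4},
    "group F": {"single dominant figure": 0, "in-group jargon": 1, "tolerates dissent": 0, "cultish_rate": 0.5},
}

features = ["single dominant figure", "in-group jargon", "tolerates dissent"]

def mean(xs):
    return sum(xs) / len(xs) if xs else float("nan")

# For each characteristic, compare the average "cultish" rate among groups
# that have it versus groups that don't.
for f in features:
    with_f = [g["cultish_rate"] for g in groups.values() if g[f]]
    without_f = [g["cultish_rate"] for g in groups.values() if not g[f]]
    print(f"{f:24s}  with: {mean(with_f):.2f}   without: {mean(without_f):.2f}")
```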
I find myself moved to break possibly the greatest taboo amongst our kind, but if this act of status suicide moves just one reader to action, the sacrifice is worth it.
OK, here goes...
"Gödel, Escher, Bach" by Douglas R. Hofstadter is the most awesome book that I have ever read.
Me too!
This whole concept is confusing to me. I enjoy Eliezer's writing because it makes sense and is useful, so it becomes part of my identity. I haven't found as many of his newer posts to be useful, so fewer of them are drafted into my identity. My 'self' is largely a collection of ideas and thoughts transmitted to me from other people, and I don't find anything wrong with this. I do hope to produce useful knowledge myself, but for right now I am educating myself to that point.
If I find a useful tool lying on the ground then I pick it up and use it, I do not try to recreate the tool from scratch in order to make it 'mine', which I feel is a meaningless concept. As long as my beliefs and skills pay for themselves in terms of useful benefits to my life I don't see the point in throwing them away because they came from someone else. I don't care who I am and I am not attached to any specific view of my self other than to try to pick the most effective tools to accomplish some core goals and values.
IAWYC but I think you forgot to include something about jealousy in your analysis, even if few people would admit it's part of it.
I think it's very possible to greatly admire someone and at the same time feel some form of jealousy that inhibits the clear expression of that admiration. By saying that someone else is better (much better) than you are - especially at something that you value - you are in effect admitting to a lower status.
So all the forced disagreements and claims of independence are in effect just trying to signal that your status is high and you're not submissive, or something like that.
When you read someone's writings or follow the things they do but don't actually KNOW them, it's very easy to get sucked into a sort of 'larger-than-life' belief about them.
Because they're famous (and they must be famous because you've heard of them), they're obviously different and special and above regular, normal people. I've found it takes conscious effort to remember that no matter how famous or smart or talented they are, in the end they're just some guy or girl, with the same flaws as everyone else.
And when you think someone's larger-than-life, it's easy to praise them highly, because you're not thinking of them as a normal person, you're thinking of them as ABOVE normal people. That they are special. In light of this, it's easy to see why praise for someone or something, no matter what it is, can be seen as cultish, and how you can fall into the trap of believing praise for anything is cultish.
Regarding this, it's really helpful when Eliezer mentions that he borrowed this or that part of his philosophy from a piece of anime fanfiction. It helps humanize him, or worse.
For what it's worth, I don't think you've deliberately set out to become a "cult leader" -- you seem like a sincere person who just happens to be going about life in a rather nonstandard fashion. You've got some issues with unacknowledged privilege and such, and I've gotten impressions from you of an insufficiently critical attraction to power and to people who have power, but that's hardly unique.
I think mostly it's that you confuse people by sending off a lot of signals they don't expect -- they think you must have some weird ulterior motive for not having gone to college, and instead of seeing public discussion of your own intellect as merely the result of somewhat atypical social skills, they read it as inexcusable arrogance.
That said, because of my own negative experience(s) with people who've seemed, shall I say, rather "sparkly" at first, but who HAVE turned out to be seeking puppy-dog supplicants (or worse), I tend to be very very cautious these days when I encounter someone who seems to attract a fan club.
With you I've gone back and forth in my head many times as to whether you are what you first struck me as (a sincere, if a bit arrogant, highly ambitious guy) ...
Well, for one thing, privilege is a major source of bias, and when a person doesn't even realize they (or those they admire) have particular types/levels of privilege, they're going to have a harder time seeing reality accurately.
E.g., when I was younger, I used to think that racism didn't exist anymore (that it had been vanquished by Martin Luther King, or something, before I was even born) and didn't affect anyone, and that if someone didn't have a job, they were probably just lazy. Learning about my own areas of privilege made it possible for me to see that things were a lot more complicated than that.
Of course it's possible for people to go too far the other way, and end up totally discounting individual effort and ability, but that would fall under the category of "reversed stupidity" and hence isn't what I'm advocating.
(And that's all I'm going to say in this thread for now - need to spend some more time languaging my thoughts on this subject.)
IAWY, and I actually already replied to your question about this in a comment, but:
One of the prime issues for me as a rationalist trying to learn about marketing (especially direct/internet marketing) was having to get over the fear of being a "dupe" pulled into a "scam" and "cult" situation. Essentially, if you have learned that some group you scorn (e.g. "suckers" or "fools" or whatever you call them) exhibit joining behavior, then you will compulsively avoid that behavior yourself.
I got over it, of course, but you have to actually be self-aware enough to realize that you chose this attitude/behavior for yourself... although it usually happens at a young enough age and under stressful enough conditions that you weren't thinking very clearly at the time.
But once you've examined the actual evidence used, it's possible to let go of the judgments involved, and then the feelings go away.
In other words, persons who have this issue (like me, before) have had one or more negative social experiences linking these behaviors to a disidentified group -- a group the person views negatively and doesn't want to be a part of. It's a powerfu...
Alright! A few points that I can sort of disagree on or feel were omitted in the essay. I'm being skeptical, not a cultist at all!
My fears aren't really that you're trying to foster a cult, or that it's cultish to agree with you. I got worried when you said that you wanted more people to vocalize their agreement with you and actually work towards having a unified rationalist front. For some reason, I had this mental picture of you as a supervillain declaring your intention to take over the world. So I reflected that I was doing things, somewhat unco...
I agree with your conclusion, and I love your library allegory. It's pretty clear that America fears strong emotions in general, and also that "our type" learns cached patterns of ritually approved-of nonconformity.
That said, some may be balking, not at admiring someone hugely, but at forming nearly their entire manner of evaluating ideas from a single person, without independent sources of evidence that can label that person "trustworthy". Anne Corwin reports fearing networks of abstractions that distance people from their own concre...
"But upon reflection, I strongly suspect that I would feel no barrier to praising Gödel, Escher, Bach even if I weren't doing anything much interesting with my life."
You don't feel yourself to be in status competition with Hofstadter do you? Or E.T. Jaynes, for that matter. Think about effusively praising Nick Bostrom as the last best hope for the survival of humane values, instead.
"I'm hoping in particular that someone used to feel this way - shutting down an impulse to praise someone else highly, or feeling that it was cultish to praise so...
Figured, since this was linked to again, that I might as well say some of what I think on this:
My reaction is more, well, a couple of things, but part of it could be described like this: Yes, I do indeed admire you and think you're cool... and my natural instinctive reaction to you is kinda, well, fanboyish, I guess. Hence I try to moderate that... TO AVOID BEING ANNOYING... that is, to avoid, say, annoying you, for instance.
If you can do that quietly without anyone noticing, you're doing it right. If you make a big deal out of it to prove something to other people, you're doing it wrong. Should be obvious, really.
If you have 'teachings' rather than suggestions or opinions, and you can't support those claims in a systematic and explicit way, then it doesn't much matter whether you intended to propagate a cult - that's precisely what you're doing.
I'm afraid to read GEB now. It's been built up so high that the only reactions I could possibly have are "as good as everybody else thinks it is" or "didn't live up to expectations", with the latter being far more likely.
Let me try to help you. Many people who praise GEB in the highest terms and recommend that everyone read it never finished it. Many read all the dialogues, but only some of the chapters. I have absolutely no data to support turning either of the previous "many" to "most", but wouldn't be surprised by either possibility.
GEB's most important strength, by far, is in giving you a diverse set of metaphors, thought-patterns, paradoxes and ways to resolve them, unexpected connections between heretofore different domains. It enlarges your mental vocabulary - quite forcefully and wonderfully if you haven't encountered these ideas and metaphors before. It's like a very, very entertaining and funny dictionary of ideas.
The exposition of various topics in theory of computation, AI, etc. that it also contains is not as important by comparison, and isn't the best introduction to these topics (it's still good and may well be very enjoyable, depending on your background and interest).
So there's no reason to fear reading GEB. You'll chuckle with recognition at the jokes, metaphors, notions that you've already learned elsewhere, and will be delighted at those you've never seen before. Read all the dialogues; if some of the chapters bore you, resist guilt tripping and skip a few - you'll come back to them later if you need them.
IMO being accused of wanting to be a cult leader is a pure double bind. You either say "yes, I do" and then you're a cult leader, or you say "what? that's crazy because of X, Y, Z..." and then people point at your protestations as evidence that their arguments have some minimal credibility (I am sure someone will do this to EY at some point). It is, prima facie, evident to me that talking to people on the internet about rationality is a poor method of getting acolytes (and even if it were a good one for some people, the Objectivists alr...
If there's an aspect of the whole thing that annoys me, it's that it's hard to get that innocence back, once you even start thinking about whether you're independent of someone.
Cross-referencing my comment on a different post for a related idea:
...Your brain remembers which "simple" predictor best described your decision [. . .]
Your brain learns to predict other people's judgments by learning which systems of predictive categories other people count as "natural". If you have to predict other people's judgments a lot, your brain starts
Ok, I'm coming out and will admit that I admire you, Eliezer, very highly. I think you are the one who taught me the most about rationality and what intelligence is all about.
Now, I admit that in my past I have fallen into the "adore the guru" trap, so I still have this fear in my head and am cautious not to make the same mistake again. The cult threads here are helping me to evaluate my position carefully.
But I like what you wrote about that innocence of being able to experience real admiration and excitement. I think if you let your critical think...
You've changed my beliefs and thinking more than anyone outside my family, by a pretty huge margin. This makes me far more likely to raise something to the level of being worth paying attention to just because you've recommended it (as it should), but it also makes me careful on a gut level every time I'm considering adopting yet another belief from you.
I think this is partly because of what you describe in this post, but partly because I know a lot of the existing beliefs I have that will be inclining me to accept the new belief came themselves from you. I'm...
my theory about myself:
I don't think people will believe me if they recognize my views as the typical LW-cluster views. They'll just dismiss them.
Which is really rational of them, actually. I think I use the same heuristic. Once I see that someone's beliefs come from a political affiliation, they're weaker evidence to me.
Like... if someone's trying to convince me out of global warming, but then I learn that she's also against affirmative action and immigration and regulation on finance. At first I might have thought she read convincing scientific arguments...
Word "cult" seems to be used in very vague sense by everyone, and people have different definitions. Here's something I wrote about Paul Graham's and a few other "cults". It's only vaguely relevant, as I used the label "cult" differently.
If you are not into Paul Graham's cult / meme complex, and you hear people who really are - talking about how working 100 hours a week on a built-to-sell startup is the best way to prove your worth as a hacker and a human being - they really sound like "cult" members.
Explanation: emotional overexcitability, a trait common to gifted people (and yes, there is good reason to believe that most LessWrongers are gifted), may cause LW and Hacker News fans to be extra excitable and intense. You've probably heard that gifted people tend to be more emotional? Well, on your LessWrong survey your respondents claimed an average IQ in the 140s, well beyond the minimum for every IQ-based definition of giftedness. If these readers are unusually emotionally intense, as gifted people tend to be, it's likely their unusual "electricity"...
(note: Hofstadter does not have a cult)
Douglas Hofstadter's research group is apparently quite cultish. It's close-knit, dominated by a single person, is not tolerant of disagreement, and has little intellectual interaction with the remainder of the field.
This doesn't make GEB less than excellent. It merely partially explains why they haven't made much progress since.
My personal test for whether you're my "cult leader" or just a good teacher is how I react when I think you're wrong. If you're merely a teacher, then I will sit down and work out exactly why you're right from base principles, and I'll admit it if I'm confused or if I think you are genuinely wrong. Given how many times in the sequences I've spent a few hours working things out, I feel safe here.
A good teacher says "here is something worth understanding" rather than "here is the teacher's password" - it is a willingness to...
I know the opposite of stupidity is still stupidity, but every time I see some idiotic attempt to gain status by pointing out how "everyone else except me seems to revere Eliezer too much" I have to restrain myself from reacting in the other direction and worshiping the guy.
I used to have the idea that finding flaws in something (a piece of writing or entertainment or an idea or a person) made me better than the person or the creator of the thing I was criticizing. Then I realized two things which got me to stop: 1) Critics are parasites; they don't generally produce anything that valuable and entertaining themselves, and even beautifully written reviews are pretty low on my list of things to read for edification or fun. 2) When I go around finding flaws in everything, I stop enjoying it, and living a life where I can't enj...
Critics are parasites; they don't generally produce anything that valuable and entertaining themselves
Debunking mistaken hypotheses is just as important as coming up with new ones. Otherwise our heads would be so filled with confused theories that we could never develop the correct ones.
Leaving aside the valid points about overrating particular experts when you have limited exposure to opposing viewpoints on the subject matter: cult-like behavior doesn't even require an intentional cult leader. Paul Graham doesn't have to willfully cultivate that type of following for some of it to arise spontaneously as a function of the social structures and participants around him.
Frequently agreeing with someone who has a lot of good ideas, and who also has high status in a community that you're a member of, is not inherently bad. But once you ...
Followup to: Why Our Kind Can't Cooperate, Cultish Countercultishness
I used to be a lot more worried that I was a cult leader before I started reading Hacker News. (WARNING: Do not click that link if you do not want another addictive Internet habit.)
From time to time, on a mailing list or IRC channel or blog which I ran, someone would start talking about "cults" and "echo chambers" and "coteries". And it was a scary accusation, because no matter what kind of epistemic hygiene I try to practice myself, I can't look into other people's minds. I don't know if my long-time readers are agreeing with me because I'm making sense, or because I've developed creepy mind-control powers. My readers are drawn from the nonconformist crowd—the atheist/libertarian/technophile/sf-reader/Silicon-Valley/early-adopter cluster—and so they certainly wouldn't admit to worshipping me even if they were.
And then I ran into Hacker News, where accusations in exactly the same tone were aimed at the site owner, Paul Graham.
Hold on. Paul Graham gets the same flak I do?
I've never heard of Paul Graham saying or doing a single thing that smacks of cultishness. Not one.
He just wrote some great essays (that appeal especially to the nonconformist crowd), and started an online forum where some people who liked those essays hang out (among others who just wandered into that corner of the Internet).
So when I read someone:
...well, that outright broke my suspension of disbelief.
Something is going on here which has more to do with the behavior of nonconformists in packs than whether or not you can make a plausible case for cultishness or even cultishness risk factors.
But there are aspects of this phenomenon that I don't understand, because I'm not feeling what they're feeling.
Behold the following, which is my true opinion:
"Gödel, Escher, Bach" by Douglas R. Hofstadter is the most awesome book that I have ever read. If there is one book that emphasizes the tragedy of Death, it is this book, because it's terrible that so many people have died without reading it.
I know people who would never say anything like that, or even think it: admiring anything that much would mean they'd joined a cult (note: Hofstadter does not have a cult). And I'm pretty sure that this negative reaction to strong admiration is what's going on with Paul Graham and his essays, and I begin to suspect that not a single thing more is going on with me.
But I'm having trouble understanding this phenomenon, because I myself feel no barrier against admiring Gödel, Escher, Bach that highly.
In fact, I would say that by far the most cultish-looking behavior on Hacker News is people trying to show off how willing they are to disagree with Paul Graham. Let me try to explain how this feels when you're the target of it:
It's like going to a library, and when you walk in the doors, everyone looks at you, staring. Then you walk over to a certain row of bookcases—say, you're looking for books on writing—and at once several others, walking with stiff, exaggerated movements, select a different stack to read in. When you reach the bookshelves for Dewey decimal 808, there are several other people present, taking quick glances out of the corner of their eye while pretending not to look at you. You take out a copy of The Poem's Heartbeat: A Manual of Prosody.
At once one of the others present reaches toward a different bookcase and proclaims, "I'm not reading The Poem's Heartbeat! In fact, I'm not reading anything about poetry! I'm reading The Elements of Style, which is much more widely recommended by many mainstream writers." Another steps in your direction and nonchalantly takes out a second copy of The Poem's Heartbeat, saying, "I'm not reading this book just because you're reading it, you know; I think it's a genuinely good book, myself."
Meanwhile, a teenager who just happens to be there glances over at the book. "Oh, poetry," he says.
"Not exactly," you say. "I just thought that if I knew more about how words sound—the rhythm—it might make me a better writer."
"Oh!" he says, "You're a writer?"
You pause, trying to calculate whether the term does you too much credit, and finally say, "Well, I have a lot of readers, so I must be a writer."
"I plan on being a writer," he says. "Got any tips?"
"Start writing now," you say immediately. "I once read that every writer has a million words of bad writing inside them, and you have to get it out before you can write anything good. Yes, one million. The sooner you start, the sooner you finish."
The teenager nods, looking very serious. "Any of these books," gesturing around, "that you'd recommend?"
"If you're interested in fiction, then definitely Jack Bickham's Scene and Structure," you say, "though I'm still struggling with the form myself. I need to get better at description."
"Thanks," he says, and takes a copy of Scene and Structure.
"Hold on!" says the holder of The Elements of Style in a tone of shock. "You're going to read that book just because he told you to?"
The teenager furrows his brow. "Well, sure."
There's an audible gasp, coming not just from the local stacks but from several other stacks nearby.
"Well," says the one who took the other copy of The Poem's Heartbeat, "of course you mean that you're taking into account his advice about which books to read, but really, you're perfectly capable of deciding for yourself which books to read, and would never allow yourself to be swayed by arguments without adequate support. Why, I bet you can think of several book recommendations that you've rejected, thus showing your independence. Certainly, you would never go so far as to lose yourself in following someone else's book recommendations—"
"What?" says the teenager.
If there's an aspect of the whole thing that annoys me, it's that it's hard to get that innocence back, once you even start thinking about whether you're independent of someone. I recently downvoted one of PG's comments on HN (for the first time—a respondent had pointed out that the comment was wrong, and it was). And I couldn't help thinking, "Gosh, I'm downvoting one of PG's comments"—no matter how silly that is in context—because the cached thought had been planted in my mind from reading other people arguing over whether or not HN was a "cult" and defending their own freedom to disagree with PG.
You know, there might be some other things that I admire highly besides Gödel, Escher, Bach, and I might or might not disagree with some things Douglas Hofstadter once said, but I'm not even going to list them, because GEB doesn't need that kind of moderation. It is okay for GEB to be awesome. In this world there are people who have created awesome things and it is okay to admire them highly! Let this Earth have at least a little of its pride!
I've been flipping through ideas that might explain the anti-admiration phenomenon. One of my first thoughts was that I evaluate my own potential so highly (rightly or wrongly is not relevant here) that praising Gödel, Escher, Bach to the stars doesn't feel like making myself inferior to Douglas Hofstadter. But upon reflection, I strongly suspect that I would feel no barrier to praising GEB even if I weren't doing anything much interesting with my life. There's some fear I don't feel, or some norm I haven't acquired.
So rather than guess any further, I'm going to turn this over to my readers. I'm hoping in particular that someone used to feel this way—shutting down an impulse to praise someone else highly, or feeling that it was cultish to praise someone else highly—and then had some kind of epiphany after which it felt, not allowed, but rather, quite normal.
Part of the sequence The Craft and the Community
Next post: "On Things that are Awesome"
Previous post: "Tolerate Tolerance"