by [anonymous] · 3 min read · 25th Jul 2009 · 16 comments


Personal Blog

We all have heroes or idols, people we look up to and turn to, in one form or another, for guidance or wisdom. Over the years, I've noticed that my feelings towards those I've idolized tend to follow a predictable pattern. What follows is that pattern, made explicit.

Stage 1: Exposure - you're exposed to the idol through some channel. Maybe something you read, or someone you know, or simply by chance. You begin to learn about them, and you become intrigued. If it's an author, maybe you pick up one of his books. If it's a group, maybe you check out their website. You begin to gradually absorb what the idol is offering. They don't actually become an idol though, until...
Stage 2: Resonance - after enough exposure, what the idol offers begins to strike a chord with you. You go on to ravenously consume everything related to it. You track down every one of the author's publications, or spend hours staring at all of an artist's paintings. As far as I can tell, what's important here isn't actually the content of what the idol offers, but the feeling of resonance it engenders.
Stage 3: Incorporation - the idol has become one of the lenses through which you view the world. Everyone and everything is compared to the idol, and everyone invariably comes up short (raise your hand if you've ever thought about someone "Well, they're pretty smart, but not as smart as Eliezer"). You change your lifestyle to be more like them, to think more like them. It's as if they have all aspects of life figured out, and you follow along in the hopes that you will reach their same understanding. The most fervent support of the idol lies here.
Stage 4: Backlash - you start to realize that the idol does not, in fact, provide the answers to all of life's questions. That they might be wrong about some things, or that their specific offering does not apply to all of life's situations. You've changed yourself to emulate the idol, and you realize that perhaps not all those changes were for the better. Paradoxically, the blame for this gets placed on the idol instead of you. Feelings toward it shift from worship to antipathy.
Stage 5: Re-incorporation - after enough time has been spent hating the idol, you come to realize, if only subconsciously, that the fault lies not in the idol, but in your worship. The cycle ends with a more reserved incorporation of what the idol does offer, along with the realization of what it doesn't. The idol ceases to be an idol, and becomes another earthly entity, complete with faults.
Of course, the sample size I'm working with is one - this may be a general feature of humans, or simply a quirk of my individual brain. If I had to bet, though, my money would be on the general feature - it's happened to me many different times over the years, with my brain in many different stages of development. And I suspect I've seen others in various stages of this (though I have no way of knowing, really).
Looking at the above stages, worship seems to mirror the progression of infectious disease. Exposure leads to an infection, which then spreads throughout the body. At this point, the body mounts a counterattack, and produces masses of white blood cells to fight off the infection. The infection is expunged, the white blood cells return to normal levels, and you're left with antibodies which contribute to a more complete immune system.
The problem with this isn't so much that it isn't rational per se - taking new evidence and updating our beliefs until they converge on the 'right' answer seems to be exactly the sort of thing we should be doing. The problem is how long it can take to get through the stages. In the past it's taken me years to reach stage five after encountering something new, and true believers in something seem to reach stage three and then just stay there. But we should be updating our beliefs as quickly as possible, not languishing with the wrong answer for huge chunks of our lives.
So my question to the community is twofold:
1) Is this something that happens to you?
and
2) Assuming this is a basic mental process that can't be just turned off, how can we cycle through it faster, so we can more quickly reach accurate beliefs?
For my part, simply recognizing that this cycle exists seems to have reduced both its duration and the extremes I swing to in each direction. But I'm curious if I can do better.



The general phenomenon you're noticing here is this: if you're trying to pick some real number x to maximize f(x), you'll start with some particular value of x and keep trying out different values. As long as f(x) seems to be increasing in x, you will increment x by larger and larger amounts. At some point, you shoot past the local maximum, and f(x) decreases. So now you start decrementing x, somewhat more slowly than you were previously incrementing it. Soon x becomes smaller than optimal, and you switch directions again. You'll gradually approach the local maximum, but you actually cross it many times.
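The dynamic this comment describes - accelerate while improving, then reverse and slow down after each overshoot - can be sketched in a few lines of code. The function, parameter values, and names below are illustrative assumptions, not anything from the comment itself:

```python
def oscillating_climb(f, x=0.0, step=0.1, growth=2.0, shrink=0.5, iters=60):
    """Hill-climb toward a maximum of f: increment x by larger and larger
    amounts while f keeps improving, then reverse direction and shrink the
    step each time we shoot past the local maximum."""
    direction = 1.0
    best = f(x)
    for _ in range(iters):
        candidate = x + direction * step
        if f(candidate) > best:
            # Still improving: accept the move and accelerate.
            x, best = candidate, f(candidate)
            step *= growth
        else:
            # Overshot the maximum: turn around, somewhat more slowly.
            direction = -direction
            step *= shrink
    return x

# Example: maximize f(x) = -(x - 3)^2, whose true maximum is at x = 3.
# The trajectory crosses x = 3 several times before settling near it.
x_final = oscillating_climb(lambda x: -(x - 3) ** 2)
```

As the comment notes, the crossings are a feature of the search itself: without knowing the maximum in advance, the only way to detect it is to pass it.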


I second pjeby; your statement appears to be quite insightful.

If we take this to be an accurate summary... the best way I can think of to make the process more efficient or speedy is to take note that we are attempting to reach a maximum - which is probably why the OP mentions that knowledge of the cycle helps.

I can't imagine any other way than just... knowing the maximum, which is hardly something to expect when learning new things.

That is brilliant. It even suggests a brain-hardware implementation that'd be consistent with PCT's generalized notion of learning.

I've had that happen. I also appear to have managed to short-circuit the cycle - it's been a few years since I've had it happen. The key, as far as I can tell, was to realize that it's literally impossible for someone to have all the answers - the world is far too complex for any human to understand that well.

Stage one and two are still about the same now, but the other three stages blend together - I automatically look for the flaws and limitations of a worldview or body of work at the same time as I try to find ways that it's useful, and I only incorporate the parts that seem worthwhile. I've also found that I'm more open to unpopular or 'obviously wrong' worldviews now, in the sense that I suspect that there's at least a bit of usefulness to be found in any of them.

On the contrary. It's easy for someone to have all the answers. Heck, even my Magic Eight Ball has all the answers. It's having correct answers that's difficult. ;)

I'm fairly sure you know what I meant. ^.^

Indeed I did. I just couldn't resist the opportunity to be a smartass. ;)

1) Yes. I'm just over the hump of stage 3 now. Judging by my past life and the current first derivative, this stage will definitely end soon and transition into something like stage 4, so your description is spot on.

2) I don't know how to convert anticipation of future belief into current belief in general. This must prove I am Bayesian-irrational. But I do know how to speed up cycling through the stages: compress time. Read more, think more, solve more problems, have more experiences. Achieve critical mass to level up.

Is it really desirable to speed the process up? To quote Miracle Max, "You rush a miracle man, you get lousy miracles". My observation of human thought evolutions like the one described above indicates that rushing them usually does little good... perhaps it is sufficient to be aware of the process as it is happening. Observe, know you're going through the stages, have a sense of humor about it.

I've not seen this full cycle since late adolescence, and as Adelene mentioned, sometimes the cycle short-circuits. The short circuit for me usually occurs when I find the proto-idol empirically wrong on a thing or two, and we are all empirically wrong on a thing or two some of the time. Humanity is then re-endowed upon the proto-idol and I go on my way. Probably still admiring of the idol in some aspects, but not fully entranced and in a glamor anymore. He/she/it is human too.

2) Assuming this is a basic mental process that can't be just turned off, how can we cycle through it faster, so we can more quickly reach accurate beliefs?

Recognise your heroes as Short Duration Personal Saviors, and adopt them deliberately.


Thanks for those links. They help to flesh out the connection between this and framing your perceptions, something I'd like to explore more fully.

Stage two can be more generally described as overvaluing a topic/author in its importance for whatever purposes you are pursuing (including "worship of the object of fandom"). Stage three is what stage two looks like given time. Then, stage four is a "correction", a reversal of the stupidity of stage two, and stage five is a "return to reason", when you realize that reversed stupidity is not intelligence.

This parallels a healthy exploration regime. When you are searching for something useful for some purpose, but don't expect to find a "packaged" solution, you look for cues (1). Finding a cue, you start looking into the topic, which requires learning at least superficially a lot of info (2). Given time and effort, you start understanding the topic, knowing your way around it (3). Having understood the topic, you find out that what you were looking for isn't there, and so stop being interested and forget most of what you learned shortly thereafter (4). Sometime in the future, the expertise you developed pays off, tipping off your intuition on a way to make use of the topic, and now, knowing where to look, you return and find the required answer (5). Here, effort is necessary to have any hope of finding the answer, so even low expectations lead to a lot of activity (stages 2-3).

I went through something like this when I read Jaynes's PT:LOS. The object of my idolatry was the Bayesian approach to data analysis. I don't think it would be accurate to say I ever had antipathy towards it, but I know now it's not the panacea I originally took it for. To a lesser extent, I idolized Jaynes. Again, I never hated him or his ideas, but I now have a better idea of where he went wrong.

1) Is this something that happens to you?

Yes.

2) Assuming this is a basic mental process that can't be just turned off, how can we cycle through it faster, so we can more quickly reach accurate beliefs?

In my experience: cognitive enhancing drugs, cycles of stress and relaxation, the need to personally solve related problems, and long-distance running, particularly in a natural environment.

Your results may vary (mine do!)

Sounds like love to me.
