Another issue with teaching it academically is that academic thought, as I already said, frames things in a mathematical and thus non-human way. And treating people like objects to be manipulated for certain goals (a common consequence of this way of thinking) is not only in bad taste; it makes the game of life less enjoyable.
Yes, intuitions can be wrong; welcome to reality. Besides, I think schools are bad at teaching things.
If you want something to be part of you, then you simply need to come up with it yourself; it will be your own knowledge. Learning other people's knowledge, however, feels to me like consuming something foreign.
Yes, the trick for that is to delete the piece of knowledge you learnt and ask the question: how could I have come up with this myself?
Of course, my defense of ancient wisdom so far has simply been to translate it into an academic language in which it makes sense. "Be like water" is the street-smarts, and "adaptability is a core component of growth/improvement/fitness" is the book-smarts. But the "street-smarts" version is easier to teach, and now that I think about it, that's what the Bible was for.
That just sounds to me like "we need wisdom because people cannot think." I would agree, considering that when you open Reddit, Twitter, or any other platform, you can find many biases being upvoted. A memetic immune system is required for a person unaware of the background literature needed to bootstrap rationality. I am not advocating for teaching anything; I have no plans to be an activist and no will to change society. But consider this: if you know enough rationality, you can easily get past all that.
Sure, a person should be aware when they're drifting from the crowd, and they shouldn't become a contrarian, since reversed stupidity is not intelligence. And even if you dissent only when you have overwhelming reason for it, you're still going to have enough problems in your life.
I would agree on the latter part regarding good/evil. Unlike other rationalists, this is why I don't have the will to change society; the internet has killed my societal moral compass for good/evil, or however you may like to put it. To be more egoistic about it: "good" just carries a positive System 1 connotation for me; I am mostly just emoting it, and I focus on my own life. Or, to be brutally honest: I don't care about society as long as my interests are being fulfilled.
The actual truth value of beliefs has no psychological effects (proof: otherwise we could use beliefs to measure the state of reality).
Agreed, the map is not the territory; it feels the same to be wrong as it feels to be right.
It's more likely for somebody to become rich making music if their goal is simply to make music and enjoy themselves, than if their goal is to become rich making music.
Yes, if someone isn't passionate about such endeavours, they may not have the will to sustain them. But if a person is totally apathetic to monetary concerns, they're not going to make it either. So one may argue, on a meta level, that it's more optimal to be passionate about a field (or to choose a field you're passionate about and want to do better in) in order to overcome akrasia. There might also be some selection bias at play: a person who's good at something is likely to have a positive feedback loop with the subject.
But the "Something to protect" link you sent seems to argue for this as well?
Yes, exactly: truth is in highest service to other goals, if my phrasing "highest instrumental value" wasn't clear. But you don't deliberately believe false things; that's what rationality is all about. Truth is nice to have, but usefulness is everything.
Believing false things on purpose is impossible either way; you don't actually anticipate them with high probability. That's not how rationalist belief works. When you believe something, that's how reality is to you; you look at the world through your beliefs.
How many great people's autobiographies and life stories have you read?
Not many, but it would be unrepresentative to generalise from that.
But it's ultimately a projection: a worldview does not reveal the world, but rather the person with the worldview.
Ethically yes, epistemically no. Reality doesn't care; this is what society gets wrong. If I am disagreeing with your climate denial or climate catastrophism, I am not proposing what needs to be done; there is a divide between morals and epistemics.
"I define rationality as what's correct, so rationality can never be wrong, because that would mean you weren't being rational"
Yes, finally you get my point. We label those things rationality: the things which work. The virtue of empiricism. Rationality is about having cognitive algorithms which systematically yield higher returns on whatever it is you want.
maps of the territory are inherently limited (and I can prove this)
I would disagree; physics is more accurate than intuitive world models. The act of guessing a hypothesis is reverse-engineering experience, so to speak: you get a causal model which is connected to you in the form of anticipations (this link is part of a sequence, so there's a chance there's a lot of background info).
When you experience something your brain forms various models of it, and you look at the world through your beliefs.
You're optimizing for "Optimization power over reality / a more reliable map", while I'm optimizing for "Biological health, psychological well-being and enjoyment of existence".
And they do not seem to have as much in common as rationalists believe
That's a misrepresentation of my position: I said truth is my highest instrumental value, not my highest terminal value. Besides, a good portion of hardcore rationalists tend to have something to protect, a humanistic cause which they devote themselves to; that tends to be aligned with their terminal values, however they may see fit. Others may focus solely on their own interests, like health, life and wellbeing.
To reiterate: you only seek truth insofar as it allows you to get what you want, but you don't believe in falsities. That's it.
But if rationality in the end worships reality and nature, that's quite interesting, because that puts it in the same boat as Taoism and myself. Some people even put Nature=God.
Rationality doesn't necessarily have nature as a terminal value. Rationality is a tool: the set of cognitive algorithms which work for whatever you want, with truth being the highest instrumental value, as you might have read in the Something to Protect article.
Rationalists tend to have heavy respect for cognitive algorithms which systematically get us what we desire. They're disturbed if there's a violation in the process which gets us there.
Finally, if my goal is being a good programmer, then a million factors will matter, including my mood, how much I sleep, how much I enjoy programming, and so on. But somebody who naively optimizes for programming skill might practice at the cost of mood, sleep, and enjoyment, and thus ultimately end up with a mediocre result. So in this case, a heuristic like "Take care of your health and try to enjoy your life" might not lose out in performance to a rat-race mentality. Meta-level knowledge might help here, but I still don't think it's enough. And the tendency to dismiss things which seem unlikely, illogical, or silly is not as great a heuristic as one would think, perhaps because any beliefs which manage to stay alive despite being silly have something special about them.
None of that is incompatible with rationality; rather, rationality will help you get there. Heuristics like "take care of your health and try to enjoy life" seem more like vague plans to fulfill your complex set of values, which one may discover more about. Values are complex, and there are various posts you can find here which may help you model yourself better and reach reflective equilibrium, which is the best you can do either way, both epistemically and morally (the former of which is more easily reached by focusing on getting better with respect to your values than by focusing on it alone, as highlighted by the post, since truth is only instrumental).
Edit: added some more links fixed some typos.
I even think it's a danger to be more "book smart" than "street smart" about social things.
Honestly, I don't know enough about people to tell if that's really the case. For me, book smarts become street smarts when I make them truly a part of me.
That's how I live anyway. For me, when you formalise street smarts they become book smarts to other people, and the latter is likely to yield better predictions, aside from the places where you lack compute, as with society, where most people don't use their brain outside of social/consensus reality. So maybe you're actually onto something here, along the lines of "don't tell them the truth because they cannot handle it" lol.
Astrology is wrong and unscientific, but I can see why it would originate. It's a kind of pattern recognition gone awry.
Well, since I wanted to dismantle the Chesterton fence, I did reach conclusions similar to yours regarding why it came to be and why they (the ancients) fell for it; the correlation-causation one is the general-purpose explanation. One major reason was agriculture, where astrology was likely to work well due to the common cause of seasons and relative star movement. So it can also be thought of as faulty generalisation.
If you had used astrology yourself, it might have ended better, as you'd be likely to interpret what you wanted to be true, and your belief that your goal in life was fated to come true would help against the periodic doubt that people face in life.
That's false. I wouldn't have socially demotivated my mom, through my apathy, from wasting too much money on astrology; if I had been enthusiastic about it, it would have fueled her desire. Astrology is like the false hope of a lottery: a waste of emotional energy.
I would likely have fallen for other delusions surrounding astrology instead of spending that time learning about things, for example going on a pilgrimage for a few weeks before exams, etc.
Besides, astrology predicts everything on the list of usual human behavior, and so more or less ends up predicting nothing.
Lastly, "systematic optimality" seems to suffer from something like Goodhart's law. When you optimize for one variable, you may harm 100 other variables slightly without realizing it (paperclip optimizers seem like the mathematical limit of this idea). Holistic perspectives tend to go wrong less often.
Well, more or less, "rational" is with respect to cognitive algorithms; you tend to have one variable: achieving goals. And cognitive algorithms which are better at reaching certain goals are more rational with respect to that goal.
There is a distinction made between truth-oriented epistemic rationality and day-to-day, goal-oriented instrumental rationality, but to me they're pretty similar, in that for epistemic rationality the goal is truth.
I think the distinction was made because there's a significant amount of epistemics in rationality.
If your goal is optimising 100 variables, then go with it. For a rationalist, truth tends to be their highest instrumental value; that's the main difference, imo, between a rationalist and, say, a post-rationalist or a pre-rationalist. They can have other terminal values above that, like life, liberty and the pursuit of happiness, etc.
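The one-variable-versus-many worry raised earlier can be made concrete with a toy sketch. Everything here is hypothetical: the payoff numbers, the "100 other variables" count, and the harm rate are made up purely to illustrate how optimizing a single measured proxy can lose to a balanced policy on the holistic total.

```python
# Toy Goodhart sketch (all numbers hypothetical): a naive optimizer
# maximizes one measured proxy while each unit of single-minded effort
# slightly harms many unmeasured variables.

def proxy_score(effort):
    return 10 * effort  # the one variable being optimized

def total_value(effort, n_other_vars=100, harm_per_var=0.2):
    # each unit of effort takes a small toll on every other variable
    return proxy_score(effort) - effort * n_other_vars * harm_per_var

efforts = [0, 1, 2, 5, 10]
best_by_proxy = max(efforts, key=proxy_score)  # picks maximum effort
best_by_total = max(efforts, key=total_value)  # picks balanced effort
print(best_by_proxy, best_by_total)  # 10 0
```

The point of the sketch is only that "best by the proxy" and "best overall" can come apart once the unmeasured costs are summed; it is not a model of any real goal.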
In case you're not aware of the difference between terminal and instrumental values.
I'm personally glad that people who chase money or fame above all end up feeling empty, for you might as well just replace humanity with robots if you care so little for experiencing what life has to offer.
I think it again depends on value being a 2-place function. Some people may find fulfillment in that; I have met some who are like that. I think quite a bit of the literature on the topic is biased in favour of common morality.
But why did Nikola Tesla's intelligence not prevent him from dying poor and lonely? Why was Einstein so awkward? Why do so many intelligent people not enjoy life very much?
I think you would need to provide evidence for such claims. My prior is set against them given the little evidence I have encountered, and I cannot update just because some cultural wisdom said so, because cultural wisdom is often wrong.
For reference, I used to think rationally, I hated the world, I hated people, I couldn't make friends, I couldn't understand myself.
Then you weren't thinking rationally. To quote:
If you say, “It’s (epistemically) rational for me to believe X, but the truth is Y,” then you are probably using the word “rational” to mean something other than what I have in mind. (E.g., “rationality” should be consistent under reflection—“rationally” looking at the evidence, and “rationally” considering how your mind processes the evidence, shouldn’t lead to two different conclusions.)

Similarly, if you find yourself saying, “The (instrumentally) rational thing for me to do is X, but the right thing for me to do is Y,” then you are almost certainly using some other meaning for the word “rational” or the word “right.” I use the term “rationality” normatively, to pick out desirable patterns of thought.
Also check Firewalling the Optimal from the Rational and Feeling Rational.
You may even learn something about rationality from the experience, if you are already far enough grown in your Art to say, "I must have had the wrong conception of rationality," and not, "Look at how rationality gave me the wrong answer!"
Also check No One Can Exempt You From Rationality's Laws.
And I disagree with a few of the moral rules because they decrease my performance in life by making me help society. Finally, my value system is what I like, not what is mathematically optimal for some metric which people think could help society experience less negative emotions (I don't even think this is true or desirable)
Well, then you can be mathematically optimal for the other metric. The laws of decision theory don't stop working if you change your utility function; unless you want to get money-pumped, lol, in which case your preferences are circular. Yes, you might argue that we're not knowledgeable enough to figure out what our values will be in various subject areas; there's a reason we have an entire field of AI alignment due to such issues, and there are various problems with inferring our desires, like the limits of introspection.
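The money-pump point can be shown in miniature. The items, the fee, and the preference cycle below are purely illustrative: an agent whose preferences run A > B > C > A will happily pay a small fee for each "upgrade", cycle forever, and only lose money.

```python
# Toy money-pump sketch (hypothetical items and fees): circular
# preferences A > B > C > A let a trader extract money endlessly.

prefers = {("A", "B"): "A", ("B", "C"): "B", ("C", "A"): "C"}

def trade(holding, offered, money, fee=1):
    # the agent pays the fee whenever it prefers the offered item
    if prefers.get((offered, holding)) == offered:
        return offered, money - fee
    return holding, money

holding, money = "A", 10
for offered in ["C", "B", "A", "C", "B", "A"]:  # two full cycles
    holding, money = trade(holding, offered, money)
print(holding, money)  # A 4 -- back where it started, 6 units poorer
```

Every individual trade looks like an improvement to the agent, yet after two full cycles it holds exactly what it started with, minus the fees; that's the sense in which circular preferences are exploitable.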
There's an entire field of psychology, yes, but most men are still confused by women saying "it's fine" when they are clearly annoyed. Another thing is women dressing up because they want attention from specific men. Dressing up in a sexy manner is not a free ticket for any man to harass them, but socially inept men will say "they were asking for it" because the whole concept of selection and standards doesn't occur to them in that context. And have you read Niccolò Machiavelli's "The Prince"? It predates psychology, but it is psychology, and it's no worse than modern books on office politics and such, as far as I can tell. Some things just aren't improving over time.
I think the majority of people aren't aware of psychology and the various fields under it. Ethics and decision theory give a lot of clarity into such decisions when you analyse the payoff matrix. I haven't read The Prince, but I have read excerpts from it in the self-improvement-related diaspora. I am not denying the value such literature gives us; I just think we should move on by learning from it and building on top of it in light of newer methods.
Besides, I am more of a moral anti-realist, so lol. I don't think there are universally compelling arguments for these ethical things, but people with enough common psychological and cultural ground can cooperate.
Modern hard sciences like mathematics are too inhuman (autistic people are worse at socializing because they're more logical and objective).
Well, it depends on your definition of inhuman: my_inhuman != your_inhuman, since value is a two-place function. My peers, when I was in high school, found at least one of the hard sciences fun. Like them, I find hard sciences pretty cool to learn about for fulfilling my other goals.
And modern soft sciences are frankly pathetic
Agreed. Some fields under psychology are pathetic. But fields like cognitive biases etc. are not.
and that it hasn't failed you much in the social parts?
Well, astrology has clearly failed me. My mom often had luddite-adjacent ideas about what I am meant to do in life, because her entire source of ethics was astrology. Astrology as career advice is like rolling a die and assigning all the well-known professions to the numbers, rather than considering actual life satisfaction or value fulfillment.
"The rules of the brain are different than those of math, if you treat the brain like it's supposed to be rational, you will always find it to be malfunctioning for reasons that you don't understand"
I would strongly disagree on the front of intelligence. Becoming more rational, as in acquiring cognitive algorithms which tend to lead to systematic optimality (in this case truth-seeking/achieving goals), is indeed possible and is pretty much a part of growth.
I would weakly disagree on the front of Internal Family Systems (with the internal double crux special case being extremely useful) and other introspective reductionist methods, where you break down your emotional responses and processes into parts, understand what you like/dislike, and attempt to bridge the two. On this front there is a plethora of competing theories, owing to the easy problem of consciousness and the attempts to understand experience functionally.
And as for the brain not working as I want it to: when I model other parts of this brain, I find it emotionally engaged in things which aren't optimal for some of my goals, and it isn't contradictory with rationality to acknowledge or deal with these feelings.
I was praising Goggins because he's more the type who is willing to fight himself, which, in more than half of the introspective models, borders on self-harm when done without acknowledgement. I find his strategy to be intuitively much better lol.
Where I would agree is that if you don't understand something, then your theory is probably wrong. There are no confusing facts, only models which are confused by facts.
Too many geniuses have failed at living good lives for me to believe that intelligence is enough.
I think growth is important. I like to think of intelligence as compute power, and of growth and learning as changing algorithms. Besides, there is a good amount of correlations with IQ you might want to look into. I think this area is very contentious (I got a System 1 response to check the social norms due to past bans lol), but we're on LessWrong, so you can continue.
This might be why I have the courage to criticize science on LW in the first place.
You're welcome. Maybe you should read the Sequence Highlights to get introduced to LW's point of view and understand other people's positions here.
Science hasn't increased our social skills nor our understanding of ourselves, modern wisdom and life advice is not better than it was 2000 years ago.
Hard disagree: there's an entire field of psychology, plus decision theory and ethics using reflective equilibrium in light of science.
Ancient wisdom can fail, but it's quite trivial for me to find examples in which common sense can go terribly wrong. It's hard to fool-proof anything, be it technology or wisdom.
Well, some things go wrong more often than others. Wisdom goes wrong a lot of the time; it isn't immune to memetic selection, and there is not much of a mechanism to prevent you from falling for false memes. Technology, after a point, goes wrong way less often. A biology textbook is much more likely to be accurate and to give better medical advice than an ayurvedic textbook.
The whole "Be like water" thing is just flexibility/adaptability
Yes, it's a metaphor for adaptiveness, but I don't understand where it may apply other than as a nice way to say "be more adaptive". It's like a logical model, like maths, but for adaptiveness: you import the idea of water-like adaptiveness into situations.
As for that which is not connected to reality much (wisdom which doesn't seem to apply to reality), it's mostly just the axioms of human cognition/nature.
You know what might be an axiom of human cognition? Bayes' rule and the other axioms of statistics. I have found that I can bypass a lot of wisdom by using these axioms where others are stuck without a proper model in real life due to ancient wisdom. E.g., I stopped taking ayurvedic medication which contained carcinogens; when people spend hours mulling over certain principles in ethics or decision theory, I know the laws that prevent such confusion, etc.
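As a minimal illustration of using Bayes' rule as such an axiom: the prior and the test accuracies below are made-up numbers, chosen only to show the base-rate effect that intuition tends to miss.

```python
# Minimal Bayes'-rule sketch (prior and accuracies are hypothetical):
#   P(H|E) = P(E|H) * P(H) / P(E)

def bayes_update(prior, likelihood, false_positive_rate):
    # P(E) via the law of total probability over H and not-H
    p_e = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / p_e

# a 90%-accurate "test" for a rare hypothesis still leaves it unlikely
posterior = bayes_update(prior=0.01, likelihood=0.9, false_positive_rate=0.1)
print(round(posterior, 3))  # 0.083
```

Even with strong-looking evidence, the posterior stays below 10% because the prior was 1%; that's the kind of mechanical correction the rule provides where folk heuristics give no guidance.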
If you're in a good mood, then the external world will seem better too. A related quote is "As you think, so you shall become", which is oddly similar to the idea of cognitive behavioural therapy.
Honestly, I agree with this part; I think this is the biggest weakness of rationalism. I think the failure to overcome akrasia in a general-purpose way is a failure of rationality. I find it hard to believe there could be a person like David Goggins who is also a rationalist. The obsession with accuracy doesn't play well with the romanticism of motivation and self-overcoming; it's a battle you have to fight and figure out daily, and under the constraints of reality it becomes difficult.
I don't think I can actually deliberately believe a falsity; it's probably going to end up as belief in belief rather than self-deception.
Besides, having false, ungrounded beliefs is likely not utility-maximising in the long run; it's a short-term pleasure kind of thing.
Beliefs inform our actions and having false beliefs will lead to bad actions.
I would agree with the Chesterton's fence argument, but once you understand that the reasons for the said belief are psychological rather than a matter of its truthfulness, holding onto it is just rationalisation.
Ancient wisdom is more of an "it works until it doesn't" kind of wisdom: you have heuristics which reach certain benign conclusions, but which fail miserably when they do fail.
Besides, someone once thought up such wisdom, and it's been downhill ever since, with people forgetting it and reinventing it. Science, on the other hand, progresses with each generation.
But when you do have a veridical gears-level model, on the other hand, you can be damn sure the thing will work.
Well, that tweet can easily be interpreted as overconfidence in their own side; I don't know whether Vance would continue being more of a rationalist and analyse his own side evenly.
I think the post was a deliberate attempt to overcome that psychology. The issue is that you can get stuck in these loops of "trying to try" and convincing yourself that you did enough; this is tricky because it's very easy to rationalise this part for comfort.
Setting up to win vs. trying to set up to win.
The latter is much easier to do than the former. The former still implies a chance of failure, but you actually try to do your best, rather than try to try to do your best.
I think this sounds convoluted; maybe there is a much easier cognitive algorithm to overcome this tendency.
Honestly, the majority of the points presented here are not new and have already been addressed in
https://www.lesswrong.com/rationality
or https://www.readthesequence.com/
I got into this conversation because I thought I would find something new here. As an egoist, I am voluntarily leaving this conversation in disagreement because I have other things to do in life. Thank you for your time.