Thanks for your answer.
I actually recently had an opportunity to skim through the Working Group 2’s full report, so I know there is some stuff there. I hadn’t thought of looking at the WG3’s, for some reason, even though it’s probably more useful for my question, so thanks for suggesting it.
According to the few posts on the topic, not everyone here agrees about the extent and implications of climate change, but, since LW is full of people more knowledgeable than me on most topics, including this one, I want to ask a question about climate change anyway:
Whatever the implications of this change, we’re releasing a lot of CO2 into the atmosphere, and it can be expected to increase global temperatures. Now, what do we do? I mean, my question is: there are a lot of politicians and activists suggesting a lot of solutions, and I can’t make sense of which ones are actually the best supported by the literature, or make the most sense. And I have the same problem with the issues and drawbacks that are never mentioned, but that would arise were we to implement those solutions.
So, my main contribution to LW seems to be asking other people to explain stuff to me (I hope you don’t mind :) ), but do you people have any ideas on the question?
I kind of don’t have an answer here, but I would say there are two things in what you’re asking. Finding tutors for obscure subjects per se does seem a bit more difficult than it should or could be, so it’s definitely an interesting question, but I’d probably say: go look on Skillshare, or Brilliant, or just about any MOOC. If anyone has better tips, I want to know :) But the second thing is, you say you want to learn Fermi estimations, or general-purpose maths. I’d definitely like to learn those myself, so I have the same problem, but it actually should be very easy: there are maths textbooks for bachelor students, and Fermi estimations are basically formalized guesstimates. But since I have the same problem of not actually getting very good at those, there must be a roadblock somewhere anyway. I guess your actual problem here is the same as mine: it’s so much easier to learn anything, even something easy, when there’s someone on your back making sure you keep doing it. If that’s the problem… well, I’m also looking for the answer. But I’m sure there are good commitment devices apart from Beeminder, which I haven’t found very helpful in this case. The best would probably be what you’re actually asking for, a tutor to make sure you keep learning, but I don’t know how to find that. Maybe ask a friend to keep an eye on your progress?
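To make "formalized guesstimates" concrete, here’s the classic piano-tuners-in-Chicago estimate sketched in a few lines of Python. Every number is my own order-of-magnitude guess, not data; the point is only the method of chaining rough assumptions:

```python
# Fermi estimate: roughly how many piano tuners are there in Chicago?
# All inputs below are rough, order-of-magnitude guesses (assumptions).

population = 3_000_000           # people in Chicago (guess)
people_per_household = 2         # guess
households = population / people_per_household

pianos = households * 0.05       # assume ~1 in 20 households owns a piano
tunings_per_year = pianos * 1    # assume each piano is tuned about once a year

# One tuner: ~2 tunings/day, 5 days/week, 50 weeks/year
tunings_per_tuner_per_year = 2 * 5 * 50

tuners = tunings_per_year / tunings_per_tuner_per_year
print(round(tuners))  # prints 150, i.e. "on the order of a hundred"
```

The final number isn’t the point; being within a factor of ten of reality, with every assumption written down where it can be challenged, is.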
The more I think of it, the more it sounds like I’ve been falling into the valley of bad rationality, which I knew about but never realised I was in. Still, that doesn’t change my question much: what’s the way out? Or rather, how do I make getting out of it easier? There aren’t many resources on that specific topic, although a lot of things come very near it indeed.
Rationality and social anxiety
The last few times I wrote stuff here, it was to ask questions that were basically about myself, and therefore entirely uninteresting to anyone else here. But I think it has finally evolved into an actually interesting question about rationality. It’s not really a complex question, or even one I have nice answers for, but it’s one that really seems to have been overlooked here, even though it also seems like it should be central to getting more people on board with rationality. Also, of course, it’s something I’m personally struggling a lot with, to the point where I would say it’s the main barrier preventing me from being as skilled at applying rationality as I would like.
Bay Area Rationality is a subculture; and it seems to include a rather large proportion of some types of people, like people on the autism spectrum, and generally all manner of contrarian folks, who may not care too much about fitting into society at large and espousing commonly-held views. Neither of these statements should be at all surprising, but the reason I’m highlighting them is that most people aren’t like that, which isn’t that surprising either, but has interesting implications. My point is that most people, including myself (*), aren’t that comfortable being a contrarian when others aren’t (as if being a contrarian when everyone else is too was at all fun ;) ); most people aren’t really able to tolerate being looked down upon by a dining room full of people after trying to suggest that shouting at each other over politics might not be the ultimate goal of a Christmas dinner; most people aren’t really able to stand up to a college teacher, or even to think about what would be a polite and productive way of explaining to him that he’s talking nonsense (**); etc., etc.
So, that’s my question for us: how can we help our brains go to all the trouble of being a rationalist while being as socially anxious as most people (if not cripplingly overanxious, like I am)?
(*) Actually, I’d say that for me in particular it’s been getting worse with time, as my mild-Asperger-induced social awkwardness was partly worn off through contact with other people.
(**) Obviously, this description of what "most people" do was based upon a mildly rigorous analysis conducted on a sample of one (1) individual. So feel free to discuss that point also, because it seems obviously true to me, so there must be something to be said against it ;)
This is the second shortform I’ve ever written, both in the space of a few days, and it’s also the second one to be me asking for advice about an issue I have. I guess I should start making more useful contributions next time ;)
Anyway, at least it won’t be a long post: I have reasons to believe that my neuroticism — as in, the Big Five personality trait of that name — is really super high. I also have reasons to believe that’s because I’m ‘on the spectrum’ or something, but that specific point is less relevant. What I’m interested in is this: I could be happier with my life if I weren’t constantly angered by small things, and not being mad at everyone and everything would let me make better choices, avoiding opportunities for getting depressed and brooding over stuff (thereby becoming even more depressed, etc., etc.) later on. However, it seems like everything I’ve seen on the topic was either not easily actionable advice, or, more often, wasn’t actually so much about neuroticism as about the mental illnesses that may go with it. Rationality has been, and still is, a huge help in dealing with that, as well as in understanding my own thought processes more generally. In fact, that’s what I’ve found the most helpful to date. But, same thing, I’ve never seen anything here that was specifically about neuroticism, even though it seems like it could be useful to people in general — and to me as well :) Hence the question: what do other very neurotic folks here do to be happy in spite of it?
Question: How to be more rational without being more misanthropic? / Rationality in life and the workplace
This is actually a terrible title, but I couldn’t find a better one.
I’m currently in college, doing something I’m interested in, with good career prospects (basically, I’m in a very good business school, though my degree also has both a touch of polsci and of sustainability/energy and environment). However, even though I like what I’m studying, I’m a lot less comfortable with the vibes that go with it: neither most management roles nor ‘warm fuzzies activism’ environmentalism (especially coupled with widespread political involvement) seem to be the most obvious place to find LW-style rationalists-in-training, and in fact it can sometimes (only rarely, I’ve got to admit) feel like I’m actually swimming against the tide by trying to be more rational, in a way that wouldn’t be true if I were in a STEM field.
However, I’d be really surprised if everyone here came strictly from programming, math, or physics backgrounds, as rationalism certainly isn’t limited to any field, being mainly a sort of personal mental hygiene for the rationalist himself.
But I’m still concerned about how to do it in practice: the social pressures and work culture in management positions sound like they would probably make it harder to practise rationalism there. No, scratch that: of course it wouldn’t prevent me at all from keeping an eye on my own cognitive processes, at least once I’ve graduated and left the heavily politicized, high-social-pressure college I’m currently in, but it would still mean some form of getting entangled in culture-war stuff, in ways that might make it more difficult for me to use rationality to do a good job. And it might also make me mad at people. Yeah, I know, we (‘we, regular human beings who can’t stand the culture wars anymore’, not only ‘we, LW folks’) all feel like this, but I’m having trouble finding a rationalist answer to that.
Hence my two questions. Rationality is mainly built for oneself, for some kind of self-improvement, and it couldn’t be otherwise, but how can it be used in contexts where there are other people around who don’t want too much of these weirdly rigorous and nerdy ways of dealing with things? And also, rationality is great for thinking about one’s worldview, as well as other people’s, and for having informed debates instead of bravery debates, etc., but it is still super hard to use all of that when a) the other person doesn’t want to depoliticize the issue, or think clearly about it, and b) they don’t want you to say you’d rather not have this conversation, either. How do I deal with these situations without becoming mad at people?
N.B.: Until today, I had only ever commented on other people’s posts here, so tell me if I’ve made any kind of mistake in writing this :)
That concept of flailing in this context seems very interesting, and it makes me wonder about some things.
In particular, I find it particularly helpful, because it explains how I often feel very anxious about some things when there are other people around who will share the burden with me, in a way that makes me practically unable to get anything done, while the same problems seem a lot simpler and less stressful when I am alone. But I don’t know if people actually feel like this in ordinary situations, like I do, or if it is because I have an overprotective family and exceedingly kind friends who spoil me :)
Assuming it is actually quite common, we could then say that a lot of problems are best solved by one person alone. And yet, it also makes much sense to say that problems are best solved by bringing together many different sources of insight and information, and — for more obviously political problems — a lot of competing interests as well. In short, a lot of people who should be brought together.
So, there is, first, probably something to say about how flailing is similar to political signaling. Also, it would be interesting to expand on that concept of flailing to think about how and when we solve problems individually vs. in groups, and in what kinds of groups. Do we flail more with people we are close to, making it easier to solve problems sensibly when discussing them with strangers? Will a culturally homogeneous group making a decision be subject to a lot of flailing, or not? Etc., etc. Basically, I am trying to think about how flailing can be seen as another way to describe at least some forms of political signaling in political decision-making, and it looks like an interesting way of describing it. But it also looks like I am not able to think clearly about it myself for the moment, which is an interesting opportunity to post a comment and see if anyone has interesting takes on these kinds of things :)
I may be oversimplifying here, but if I wanted to sum up what being a maze is about in a few words, I would say it has to do with a kind of Goodhart’s law problem: in large organizations, it is hard for the top to get information, so everything is reduced to simple metrics, and we end up optimizing for them, eventually destroying everything else. If that problem of information really is the bulk of it, the fact that your plan does not really remove the need for indirect measurements of things is the big issue, and I am not sure what could be done to solve that. In fact, I’m not sure we could do it completely in any context. You want to reward people who "Disengage entirely with mazes and traditional distortionary incentives, competitions and signals of all kinds"; for mazes, something might be tried, but disengaging from status-seeking, attractive as it sounds, looks like saying "just disengage from cognitive biases": a good goal, so long as you know you won’t fully achieve it.
More broadly, the most reliable thing I can think of to reduce these communication problems — indeed the one thing we replaced with mazes — is a lot of social capital, which probably implies limiting oneself to small communities and small businesses. It also implies much more social pressure, with all its problems. To an extent, that’s the point, but it’s also something we really would want to avoid doing too much. Or it may even be that we are mainly complaining about mazes because that’s what they are already doing. I wonder to what extent the need for social capital, if we don’t want to have mazes, might be compensated by the fact that, compared to a hypothetical pre-mazes era, communication costs are down by a massive amount nowadays.
It really feels like something that’s so obvious most people — including myself — have long forgotten it. The main argument is how, when you don’t voice what’s actually going on in your brain when you think about something, you’re likely to eventually stop thinking carefully about it, and end up deluding yourself about it. I assume a lot of people here have thought of that already, but, as for myself, I enjoyed being reminded of it.
Actually, I’m the kind of person who had awful social skills, and I’ve been trying hard to upgrade to poor/ok-ish social skills, and I’ve been feeling weird about it, without realising it might be linked to the fact that I’m now much more ready (far too ready, in fact) to let myself be influenced by others out of not wanting to disappoint.