LessWrong developer, rationalist since the Overcoming Bias days. Connoisseur of jargon.
People often model new norms as a stag hunt – if only we all pitched in to create a new societal expectation, we'd reap benefits from our collective action.
I think this is wrong, because it restricts the scope of what counts as a "norm" to only cover things that affect misaligned components of people's utility functions. If a norm is the claim that some category of behavior is better than some other category of behavior according to a shared utility function with no game-theoretic flavor to it, then anyone who fully understands the situation is already incentivized to follow the norm unilaterally, so it isn't a stag hunt.
This was a significant controversy among the moderators. I believed, and continue to believe, that the post should have stayed up.
I wasn't at this meetup, but it sounds to me like he was speaking to a large group of people who did not individually agree to any sort of confidentiality. I also think the bar for moderators removing this sort of thing is very high; removing it would send a very bad signal about our trustworthiness and the trustworthiness of our ecosystem in general.
(I'm a moderator, but speaking here as an individual.)
There's a commonly-hypothesized version of the feedback loop that has one more step: paying attention to the pain causes you to tense muscles in the area that don't need to be tense, and which hurt if they are. This mechanism implies that certain physical interventions will work (things which un-tense the muscles). It also de-mystifies what "paying attention to the pain" means, in a way that's more actionable and less psychology-flavored.
I've been writing a series of posts about nutrition, trying to consistently produce one post per day. The post I had in mind for today grew in scope by enough that I can't finish it in time, so this seems like an opportune day for a meta-post about the series.
My goal, in thinking and writing about nutrition, is to get the field unstuck. This means I'm interested in solving the central mysteries, and in calling attention to blind spots. I'm primarily writing for a sophisticated audience, and I'm making little to no attempt to cover the basics. I'm not going to do the sort of literature review that goes through all the vitamins and minerals in order, saying approximately the same things Wikipedia says about each of them. There are enough of those out there already. If you're just trying to figure out how to lose weight, then these posts will be interesting, and they will probably give you a perspective that makes evaluating other sources a lot easier, but they will not be optimized for solving your problem directly.
I have the reliability-vs-generativity-tradeoff slider set all the way to "generativity". It would be very surprising if I finished this post series without saying anything untrue. I will not repeat this epistemic status on every post, but this epistemic status does apply to all of them.
Obesity and weight loss will come up a lot in my writing, because the obesity epidemic is this big conspicuous mystery that a lot of people have studied intensively, and it's somewhat central and connected to other subtopics within nutrition. But it's not really what I care about, except insofar as it affects productivity and general health.
I haven't been putting content warnings at the top of my posts. I'm going to start.
There's the obvious content warning, which is that some people with certain classes of eating disorders don't want to read about food or nutrition in general, or only want to read about it when they're at their best, because thinking about the topic makes them stress about the topic which makes them do dumb things. I think that the particular ideas I have to present are probably net-good for most such people, but they probably want to make a conscious choice about whether and when to read my posts, and I don't want them to have to unfollow me.
The second warning is that I'm making little to no effort to cover the basics, and by that I mean I'm not going to reliably provide the warnings-away-from-spectacular-failures that mainstream nutrition advice focuses on. If I imagine my posts in a grocery-store checkout aisle magazine, being read by average people, I think some of those people might die. So, watch out. If you manage to give yourself scurvy, this will be your own fault, and I will call you a scallywag.
Last piece of meta: I'm posting these as I go with minimal editing, but there will probably be a more polished second-pass version of some sort in the future. If you're curious about my nutrition thoughts but feel no urgency, then it might be worth waiting for it.
(Crossposted on Facebook)
Does it solve your use case if I edit prev/next links into all of them?
(For now I'm focused on keeping a writing cadence going, and not thinking too much about publication format. There's a decent chance that, after I've depleted the backlog of unpublished ideas I've had, I'll do a second pass of some sort and make it more polished; but I don't think that's certain enough that you should count on it.)
Our past beliefs affect what we pay attention to, how we prioritize our skepticism, and how we interpret ambiguous evidence. This can create belief basins, where there are multiple sets of beliefs that reinforce each other, appear internally consistent, and make it hard to see the other basins as valid possibilities. On the topic of nutrition, I seem to have found myself in a different basin. I've looked through every nonstandard lens I could find, repeatedly applied skepticism, and firmly committed to not making the same mistakes everyone else is making (as a priority on par with not making mistakes at all). I've arrived at a set of beliefs that, as far as I can tell, is internally consistent, reasonably compelling from the inside, and completely contrary to what most other people in our culture think.
This makes for a difficult writing project. When I try to argue nonstandard positions, many of the arguments are tendrils reaching into other nonstandard positions. I've finally managed to get into a post-every-day cadence; part of the key to that was accepting that sometimes those arguments will be dangling references. Hopefully after a month of this, the whole thing will cohere. If not, well, the fragments are pretty interesting too.
The most-common basin of nutrition theorizing centers on obesity, and on a particular theory of obesity which goes like this. Food, especially modern processed food, tastes good and is appealing. Some people, if they followed their urges, would eat too much or become obese, so they have to exert self control not to. Weight is a function of calorie intake and calorie expenditure ("calories in, calories out"), and expenditure is primarily a function of behavior. So if someone is trying to lose weight, and it isn't working, then they must be having willpower failures and eating more than they intend, or exercising less than they intend.
I currently think there are quite a lot of things wrong with this model, but today, I'd like to focus on one in particular. It's not the only or the most central objection, nor is it a particularly actionable model fragment. But it's an issue that's important to me in particular, and it's one of the wedges that moved me into a different belief basin.
I am not obese, and have never set out to lose weight. But sometimes, I have overwhelming cravings for sugar. I would not be able to resist these cravings without great willpower.
If I ever did successfully resist one of those cravings, I would probably die.
I don't mean this figuratively, or in a heart-disease-years-later sort of way. I mean that if I get a powerful craving for sugar, and I don't promptly eat something that has sugar in it, then this will be a life-threatening medical emergency. This is because I have type 1 diabetes, and craving sugar is a symptom of low blood sugar, aka hypoglycemia. What T1 diabetes means, basically, is that I have to micromanage my blood sugar using insulin. Eating carbohydrates raises blood sugar, insulin lowers it, these need to be matched pretty precisely, and the whole thing is somewhat error prone. Too much insulin and blood sugar falls below 70mg/dL, and I get the sugar craving. I've never been below 40mg/dL, but people who drop that low become mentally impaired, then lose consciousness, then die.
Under the usual theory of obesity, craving sugar would mean that I had been hijacked by the superstimulus of processed food, and that willpower was my defense against this hijacking. But actually, in this case, the craving is a safety mechanism. Sugar craving is to dangerous hypoglycemia as thirst is to dehydration.
With that example in mind, I started thinking about the double-digit percentage of people who drop out of weight-loss studies. And the much larger percentage of people who start weight-loss diets, privately resolved to continue until they reach a target weight, and stop early. What would happen to them, in the counterfactual world where the diet was enforced perfectly from outside, and willpower wasn't an issue? Whether they would lose weight seems like very much the wrong question to ask.
I think there may be a negative correlation between short-term and long-term weight change on any given diet, causing people to pick diets in a way that's actually worse than random. I'm planning a future post about this. I'm not super confident in this theory, but the core of it is that "small deficit every day, counterbalanced by occasional large surplus" is a pattern that would have signaled food-insecurity in the EEA. Some mechanism (though I don't know what that mechanism would be) then lets the body remember that this happened, and respond by targeting a higher weight after a return to ad libitum eating.
Yesterday, I wrote a post about the Regression to the Mean Diet. The biggest impact knowing about the Regression to the Mean Diet has had for me is on my interpretation of studies, where it's a lens that reveals what would otherwise be the best studies as mostly useless, and of anecdotes, where it makes me heavily discount claims about a new diet working unless I've gotten to ask a lot of questions about the old diet, too. But there's one other implication, which I left out of the original post, because it's kind of unfortunate and is a little difficult to talk about.
I'm not interested in nutrition because I care about weight, or body aesthetics, or athletic performance. I care about nutrition because I believe it has a very large, very underappreciated impact on individual productivity. Low quality diets make people tired and depressed, so they don't get anything done.
The Regression to the Mean Diet predicts that if you reroll the eating habits of someone whose diet-related health is unusually bad, then their new diet will probably be an improvement. This has a converse: if you reroll the eating habits of someone whose diet-related health is good, especially if that person is a peak performer in some way, then their new diet will be worse.
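This asymmetry is just regression to the mean, and a toy simulation makes it concrete. All the numbers below are made up for illustration: model each person's diet-related health as a stable personal baseline plus an effect from their current diet, both drawn from a standard normal, and see what rerolling the diet does at each end of the distribution.

```python
import random

random.seed(0)

# Toy model (numbers are invented, not calibrated to anything real):
# diet-related health = stable personal baseline + effect of current diet.
n = 100_000
people = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

# For each person: (health on old diet, health after rerolling the diet).
rolls = [(baseline + diet, baseline + random.gauss(0, 1))
         for baseline, diet in people]
rolls.sort(key=lambda r: r[0])  # sort by health on the old diet

def mean_change(group):
    return sum(new - old for old, new in group) / len(group)

worst_delta = mean_change(rolls[:1000])   # bottom 1%: unusually bad outcomes
best_delta = mean_change(rolls[-1000:])   # top 1%: the peak performers

print(f"average reroll effect, worst-off 1%: {worst_delta:+.2f}")
print(f"average reroll effect, best-off 1%:  {best_delta:+.2f}")
```

The worst-off group improves on average and the best-off group gets worse, purely because an extreme outcome was partly luck (an unusually good or bad diet draw) that a reroll doesn't preserve.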
Under this model, one of the most destructive things you could do would be to identify top performers in important areas, people in good health with no nutritional problems, and convince them they need to change their diet.
Which brings me to vegan outreach within the Effective Altruism movement.
I don't think an animal's suffering is anywhere close to as bad as a similar amount of suffering in a human, but I do think it matters, and that this makes modern factory farming quite bad. While I have qualms about the quality of vegan diets in practice, I think that if you convince an average person from the general public to switch from an omnivorous diet they haven't thought much about to a vegan diet with any thought at all put into it, this will on average be an improvement. I think externally-facing vegan outreach is good, and while I wouldn't prioritize it over AI alignment or anti-aging research, I am in favor of it.
But inward-facing vegan outreach scares me. Because EA is in fact seeking out top performers in important areas, and introducing them to its memes. Under the current social equilibrium, those people feel some pressure to reduce their meat consumption, but not many make large dietary changes; most of the people who are vegetarian or vegan within EA were vegetarian or vegan beforehand. It's easy to imagine a different equilibrium, in which the majority of omnivores who get involved in EA go vegan.
I worry that in that world, what would be the top-percentile people are no longer top percentile, and no one notices the absence or makes the connection.
The altitude correlation would seem to point to drinking water in particular as a culprit, and suggests a simple and straightforward study that would settle the question once and for all: randomize a group of households to either receive reverse osmosis filters on their taps or not, then track whether the people in those households become obese.
I checked whether this study has been performed, and as far as I can tell, it hasn't. There have been studies that randomly installed reverse osmosis filters, but they were checking for something else and didn't track peoples' weight.