We report here that a CR regimen implemented in young and older rhesus monkeys at the National Institute on Aging (NIA) has not improved survival outcomes. Our findings contrast with an ongoing study at the Wisconsin National Primate Research Center (WNPRC), which reported improved survival associated with 30% CR initiated in adult rhesus monkeys (7–14 years) [5], and a preliminary report with a small number of CR monkeys [6]. Over the years, both NIA and WNPRC have extensively documented beneficial health effects of CR in these two apparently parallel studies. The implications of the WNPRC findings were important as they extended CR findings beyond the laboratory rodent to a long-lived primate. Our study suggests a separation between health effects, morbidity and mortality, and, similar to what has been shown in rodents [7–9], study design, husbandry and diet composition may strongly affect the life-prolonging effect of CR in a long-lived nonhuman primate.


Reposting my comment from gwern's google+ with one edit:

I know we've talked about a very similar study before (it looks like this is a different group with a different set of monkeys), but as always: N! N! N!

I think the approach the authors take is basically worthless. You would have trouble reliably detecting smoking (hazard ratio ≈2 for humans) with only 40 experimental subjects and 46 controls, and I haven't gotten a good estimate of what the hazard ratio for CR / IF should be. It's almost definitely not .5, and I would be surprised if it were even as significant as, say, .8. The Bayesian thing to do would be to report "we think the hazard ratio is 1.2, but our 5th percentile is X and 95th percentile is Y" (or, ideally, the whole likelihood function).
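A rough Monte Carlo sketch of the power problem, assuming exponential survival times and using a Mann-Whitney test as a crude stand-in for a log-rank test (all parameters here are illustrative, not fit to the monkey data):

```python
import numpy as np
from scipy.stats import mannwhitneyu

def detection_power(n_treat=40, n_ctrl=46, hazard_ratio=0.8,
                    n_trials=2000, alpha=0.05, seed=0):
    """Fraction of simulated trials in which the benefit is detected at level alpha."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        # lower hazard -> longer mean lifespan (mean = 1/hazard for an exponential)
        treated = rng.exponential(1.0 / hazard_ratio, n_treat)
        controls = rng.exponential(1.0, n_ctrl)
        p = mannwhitneyu(treated, controls, alternative="greater").pvalue
        hits += p < alpha
    return hits / n_trials

print(detection_power())  # well under 50% power for a hazard ratio of 0.8
```

With roughly these group sizes, a hazard ratio of 0.8 is detected only a small fraction of the time, which is the "N! N! N!" complaint in numbers.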

As well, we're mostly looking at monkeys that have died at or before the median age. (Slightly less than half of the young monkeys are currently alive.) Supposing CR completely eliminated the risk of cancer, at the cost of increasing deaths due to accidents, what would the mortality curves look like? Early on, the CR monkeys would look worse, as they died of more accidents, until every control monkey was eaten by the Gompertz curve and the CR monkeys continued on as a pure exponential.
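The crossing-curves point can be illustrated numerically with made-up hazards (none of these parameters are fit to real data): a control group facing a Gompertz hazard versus a CR group facing a constant, accident-like hazard that is higher at first.

```python
import math

def surv_control(t, a=0.001, b=0.3):
    # Gompertz hazard a*exp(b*t): S(t) = exp(-(a/b) * (exp(b*t) - 1))
    return math.exp(-(a / b) * (math.exp(b * t) - 1))

def surv_cr(t, c=0.01):
    # constant hazard c (accidents only): S(t) = exp(-c*t), a pure exponential
    return math.exp(-c * t)

for t in (1, 10, 20, 30):
    print(f"t={t:2d}  control={surv_control(t):.3f}  CR={surv_cr(t):.3f}")
# early on the CR curve sits below the control curve; by t=20 it is well above it
```

With these toy numbers the CR group looks worse until the Gompertz term overwhelms the controls, after which the CR survival curve dominates, so truncating observation at the median age can hide the benefit entirely.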

Things worth noting:

  1. In the other study, none of the CR monkeys were even pre-diabetic, whereas diabetes was rampant among the controls. Here, 2 CR monkeys were diabetic. They note that this is interesting (read: odd).

  2. 0 of the CR monkeys have been diagnosed with cancer; 6 of the control monkeys have already died of it. (This is only p=0.028! N! N! N!)

  3. Diet composition was significantly different. These monkeys got their protein from wheat, corn, and other sources; the other monkeys' only protein source was lactalbumin. These monkeys had a diet rich in anti-oxidants; the other monkeys' diet may not have been.

  4. The other study had a diet with 29% sucrose; this study had a diet with 4% sucrose. Perhaps this handicapped the controls in the other study.

  5. In the other study, both groups got the same diet, which resulted in over-supplementation of the controls. Here, nutritional supplements were handled separately for each group.

  6. The controls in the other study were fed truly ad libitum (read: obese); the controls in this study were on a restricted (but less severely restricted) diet. The ad libitum feeding in the other study probably handicapped its controls.

  7. It looks like adolescence may be the best time to start CR, but judging that from N=40 is probably unwise. (It makes sense biologically, though: less need for an anti-cancer measure when you're young, and starting when elderly might be too late.)
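The p=0.028 in point 2 can be checked with a Fisher exact test, assuming group sizes of 40 CR and 46 control monkeys (the numbers quoted above for the survival comparison):

```python
from scipy.stats import fisher_exact

# rows: CR group, controls; columns: cancer deaths, no cancer diagnosis
table = [[0, 40],
         [6, 40]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(round(p_value, 3))  # matches the quoted p = 0.028
```

A two-sided p of about 0.028 on a 0-versus-6 split shows just how little resolution these group sizes give: even a complete absence of cancer in the CR arm barely clears conventional significance.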

I haven't seen much directly comparing CR and IF, so I'm doing IF, as it's easier and likely roughly as good. I really want to see bigger studies on this, though, and ideally human studies (they're unlikely to be done by the time I need them, but we might as well find out for our descendants!).

Also, an additional thing I forgot to post there: the degree of calorie restriction is meaningful. I saw the number for the other study (30%, IIRC), but I didn't see this study quote its percentage. Mouse results have suggested that 10% CR is better than 30% or 40%, so the percentage matters.

Derek Lowe also commented on the studies. Repeating my comment there:

So the comparison of the two experiments shows that underfeeding results in life extension over monkeys that over-eat, but not over monkeys that eat a normal diet. Where is the surprise there?

ADDED: I just noticed the paragraph here is missing a key bit of information needed to make sense of my comment. The WNPRC experiment, which found positive results from calorie restriction, fed their controls ad libitum, as much as they wanted to eat. The newer NIA experiment fed the controls a standard, healthy diet, and found no effect of diet restriction.

In the NY Times article they mention that there were at least a couple of differences between this study & the positive study that came out in 2009. Anyone care to calculate the Bayesian probability of caloric restriction extending life based on 1 positive study & 1 negative?

Since the 2009 study was, at the time, accused of data-mining with its mortality statistics (this criticism is explained in the NYT article), while this one has not been (so far), I'd regard the 2009 datapoint as weaker than this one and hence the two studies as a weak net negative.

EDIT: And depending on how you interpret the diet of the control animals (the positive 2009 study let controls pig out; the negative 2012 study kept controls on a more moderate, normal healthy diet), one could argue that the 2012 study is much stronger than the 2009 study for the question we really care about: will switching from a healthy moderate diet to an extreme CR-style diet improve my health & longevity?

Closer to even odds than your prior, whatever that might be...
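That "closer to even odds" intuition can be made concrete with a toy Bayesian update; the likelihood ratios below are invented purely for illustration (a positive-but-criticized study counted as weak evidence for, a negative study as comparable evidence against):

```python
def posterior_odds(prior_odds, *likelihood_ratios):
    # Bayesian update on odds: posterior = prior x product of likelihood ratios
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# hypothetical numbers: even prior odds, WNPRC 2009 study LR = 2 (discounted
# for the data-mining criticism), NIA 2012 study LR = 0.5
print(posterior_odds(1.0, 2.0, 0.5))  # -> 1.0, i.e. back to even odds
```

Whatever likelihood ratios you pick, one weak positive and one comparable negative roughly cancel, which is the point: the pair of studies should pull your posterior toward even odds rather than strongly in either direction.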