Roko's Shortform

The US FDA (U.S. Food and Drug Administration)'s current advice on what to do about covid-19 is still pretty bad.

Hand-washing and food safety advice seems to just be wrong: as far as we can tell, covid-19 is almost entirely transmitted through the air, not on hands or food. Hand-washing is a good thing to do, but it won't help against covid-19, and talking about it displaces talk about things that actually do help.

6 feet of distance is completely irrelevant inside and superfluous outside. Inside, distance doesn't matter - time does. Outside is so much safer than inside that you don't need to think about distance; you need to think about spending less time inside [in a space shared with other people] and more time outside.

Cloth face coverings are suboptimal compared to N95 or P100 masks, and you shouldn't wear a cloth face covering unless you are in a dire situation where an N95 or P100 isn't available. Of course it's better than not wearing a mask, but that is a very low standard.

Donating blood is just irrelevant right now, we need to eliminate the virus. Yes, it's nice to help people, but talking about blood donation crowds out information that will help to eliminate the virus.

Reporting fake tests is not exactly the most important thing that ordinary people need to be thinking about. Sure, if you happen to come across this info, report it. But this is a distraction that displaces talk about what actually works.

Essentially every item on the FDA graphic is wrong.

In fact the CDC is still saying not to use N95 masks, in order to prevent supply shortages. This is incredibly stupid - we are a whole year into covid-19, there is no excuse for supply shortages, and if people are told not to wear them then there will never be an incentive to make more of them.

Bet On Biden

It doesn't help that Trump unexpectedly won the 2016 election, transferring a lot of money to biased bettors.

I mean they got the answer right so it seems a little arrogant to call them biased. Maybe they just had a good heuristic to do with preference falsification?

What is our true life expectancy?
Answer by Roko · Oct 24, 2020

The expectation is probably around 1 billion:

10% × 10 billion years (living roughly as long as the universe has already existed) +

90% × die within 1000 years (likely within 70 for most people here!)

Total: 1 billion
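The two-branch arithmetic above can be checked in a few lines (the probabilities and horizons are the estimates from the text, not established figures):

```python
# Expected lifespan from the two-branch estimate:
# 10% chance of ~10 billion years, 90% chance of dying within ~1000 years.
p_long, long_years = 0.10, 10e9
p_short, short_years = 0.90, 1000
expectation = p_long * long_years + p_short * short_years
print(f"{expectation:.3g} years")  # the short branch contributes almost nothing
```

Note how completely the long branch dominates: the 90% branch adds only about 900 years to an expectation of a billion.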

I am ignoring the infinite possibilities, since any finite system will start repeating configurations eventually; so I don't think that infinite life for a human even makes sense (you'd just be rerunning the life you already have, and I don't think that counts).

Have the lockdowns been worth it?
Answer by Roko · Oct 18, 2020

A spreadsheet model I made investigates the trade-off between lost life-years from covid-19 deaths versus lost life-years from reduced quality of life being locked down. You can make a copy of it and play with the various parameters.

The spreadsheet uses real data about the mortality risk from covid-19 and the population structure (life expectancy, population pyramid) for the USA.

With the parameters that I chose, lockdowns of 1.25 years destroy about 0.25 life-years per person on net.
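To make the structure of the trade-off concrete, here is a minimal Python sketch of the kind of calculation the spreadsheet performs. Every parameter value below is a placeholder chosen for illustration - none of them are the spreadsheet's actual inputs:

```python
# Illustrative life-years trade-off; all numbers are placeholders,
# not parameters from the actual spreadsheet model.
population = 330e6            # USA, roughly
lockdown_years = 1.25
qol_loss = 0.3                # hypothetical: life-years lost per year locked down
deaths_averted = 1.5e6        # hypothetical
life_years_per_death = 10.0   # hypothetical mean remaining life expectancy
                              # of covid decedents

ly_lost_lockdown = lockdown_years * qol_loss                   # per person
ly_saved = deaths_averted * life_years_per_death / population  # per person
net = ly_lost_lockdown - ly_saved
print(f"net life-years destroyed per person: {net:.2f}")
```

The structure is the point: a modest quality-of-life loss multiplied over the whole population can outweigh the life-years saved, because covid deaths are concentrated among people with fewer remaining life-years.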

The Solomonoff Prior is Malign

adding execution/memory constraints penalizes all hypotheses

In reality these constraints do exist, so the question of "what happens if you don't care about efficiency at all?" is really not important. In practice, efficiency is absolutely critical and everything that happens in AI is dominated by efficiency considerations.

I think that mesa-optimization will be a problem. It probably won't look like aliens living in the Game of Life though.

It'll look like an internal optimizer that just "decides" that the minds of the humans who created it are another part of the environment to be optimized for its not-correctly-aligned goal.

Fermi Challenge: Trains and Air Cargo

Related to this question, I discovered a new rule for Fermi estimates.

If you want to estimate the mean of a lognormally distributed random variable, giving the middle order of magnitude will be wrong, because lognormal distributions are skewed.

There is a simple rule for getting this right that I discovered: take your middle order of magnitude (i.e. if you think it's between 10 and 100, the middle is 10^1.5) and add 1.15 times the square of your estimate of the standard deviation in log-space. So in this case that's 1.5 + 1.15 × 0.5^2 ≈ 1.79. Then take 10 to that power. This gives you about 60 - twice the answer you would have gotten from the middle order of magnitude alone.

This applies to things like "how big is a country" and "how many miles of track is there for a country of a given size" which might both be lognormally distributed.
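A quick Monte Carlo check of the rule, using the 10-100 worked example above (log-space mean 1.5, standard deviation 0.5):

```python
import math
import random

mu, sigma = 1.5, 0.5   # log10-space mean and std dev for the 10-100 example
rule = 10 ** (mu + (math.log(10) / 2) * sigma ** 2)  # the 1.15 is ln(10)/2

random.seed(0)
samples = [10 ** random.gauss(mu, sigma) for _ in range(200_000)]
mc_mean = sum(samples) / len(samples)
print(rule, mc_mean)  # both around 61, vs 10^1.5 ≈ 31.6 for the naive midpoint
```

The sampled mean and the closed-form rule agree, and both are roughly double the naive midpoint estimate.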

The Solomonoff Prior is Malign

It seems to me that using a combination of execution time, memory use and program length mostly kills this set of arguments.

Something like a game-of-life initial configuration that leads to the eventual evolution of intelligent game-of-life aliens who then strategically feed outputs into GoL in order to manipulate you may have very good complexity performance, but both the speed and memory are going to be pretty awful. The fixed cost in memory and execution steps of essentially simulating an entire universe is huge.

But yes, the pure complexity prior certainly has some perverse and unsettling properties.

EDIT: This is really a special case of Mesa-Optimizers being dangerous. The set of dangerous Mesa-Optimizers is obviously bigger than just "simulated aliens", and even time- and space-efficient algorithms might run into them.

Roko's Shortform

One weird trick for estimating the expectation of Lognormally distributed random variables:

If you have a variable X that you think is somewhere between 1 and 100 and is lognormally distributed, you can model it as a random variable with distribution ~ Lognormal(1,1) in base 10 - that is, log10(X) ~ Normal(1,1).

What is the expectation of X?

Naively, you might say that since the expectation of log(X) is 1, the expectation of X is 10^1, or 10. That makes sense, 10 is at the midpoint of 1 and 100 on a log scale.

This is wrong though. The chances of larger values dominate the expectation or average of X.

But how can you estimate that correction? It turns out that the rule you need gives 10^(1 + 1.15 × 1^2) = 10^2.15 ≈ 141.

In general, if X ~ Lognormal(a, b) where we are working to base 10 rather than base e, this is the rule you need:

E(X) = 10^(a + 1.15*b^2)

The 1.15 is actually ln(10)/2 ≈ 1.1513.

For a product of several independent lognormals, you can just multiply these together, which means adding in the exponent. If you have 2 or 3 things which are all lognormal, the variance-associated corrections can easily add up to quite a lot.

Remember: add 1.15 times the sum of the squared log-space standard deviations!
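As an illustration of how the corrections add up for products (all parameter values here are made up for the example): three independent factors, each with a log-space standard deviation of 0.5, pick up a combined correction of 1.15 × 3 × 0.25 ≈ 0.86 in the exponent - a factor of about 7 over the naive midpoint estimate.

```python
import math

LN10_HALF = math.log(10) / 2   # ≈ 1.15

# Three independent base-10 lognormal factors (a = log-space mean,
# b = log-space std dev); these values are purely illustrative.
params = [(1.0, 0.5), (0.5, 0.5), (2.0, 0.5)]

naive = 10 ** sum(a for a, _ in params)                           # midpoints only
corrected = 10 ** sum(a + LN10_HALF * b ** 2 for a, b in params)  # with correction
print(corrected / naive)  # correction factor, about 7.3 here
```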

Fermi Challenge: Trains and Air Cargo
Answer by Roko · Oct 14, 2020

A country with trains might have the equivalent of 3-10 times the length of the country worth of train tracks, and countries are roughly 600-3000 miles across, with perhaps 10-100 countries that have a lot of trains and are also not far below the 600-mile size. The geometric means are approximately 5.5 (track-length multiplier), 1350 miles, and 30 countries. Multiply these and you get 222,750 miles.

Q1 Answer: 222,750 miles.

Post-hoc edit: If you take these numbers and do the right math on them, you get a more accurate answer. The right math for this kind of thing is not just multiplying the geometric means: you want to take 50 draws from a random variable that's the product of country size S times track-length multiplier L, each lognormally distributed. If you do that you get about 600,000; if you do it for 30 draws, you get 346,000. So, if you use the numbers I gave with some actual math, it's 346,000. But I admit I looked at the correct answer before deciding whether to bother doing the math.
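The "right math" can be sketched analytically instead of by drawing samples: treat country size S and track multiplier L as base-10 lognormals whose 1-sigma ranges match the ranges given above, apply the lognormal mean correction to each, and multiply by the number of countries. (Reading the stated ranges as 1-sigma intervals is my assumption.)

```python
import math

LN10_HALF = math.log(10) / 2

def lognormal_mean(lo, hi):
    """Mean of a base-10 lognormal whose 1-sigma range is [lo, hi]."""
    a = (math.log10(lo) + math.log10(hi)) / 2   # log-space midpoint
    b = (math.log10(hi) - math.log10(lo)) / 2   # log-space std dev
    return 10 ** (a + LN10_HALF * b ** 2)

countries = 30  # geometric mean of 10-100
miles = countries * lognormal_mean(600, 3000) * lognormal_mean(3, 10)
print(f"{miles:,.0f} miles")  # ~365,000, same ballpark as the 346,000 above
```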

Air freight is perhaps 10%-30% of passenger travel, since I see more passenger planes than freight planes at airports. Passenger travel is easier to estimate: say each of 1,000,000,000 people travels twice per year by air, and 15 people plus their baggage make one metric ton. Then 2 × 17% × 1,000,000,000 / 15 ≈ 22,000,000 metric tons (17% being the geometric mean of 10% and 30%).

Q2 Answer: 22,000,000 metric tons per year for both years.
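The Q2 arithmetic spelled out (17% is the geometric mean of the 10%-30% range):

```python
import math

trips_per_year = 2 * 1e9                    # 1 billion people flying twice a year
tons_per_passenger = 1 / 15                 # 15 people plus baggage per metric ton
freight_fraction = math.sqrt(0.10 * 0.30)   # geometric mean of 10%-30%, about 17%

tons = trips_per_year * tons_per_passenger * freight_fraction
print(f"{tons:,.0f} metric tons per year")  # about 23 million
```

This comes out slightly above 22 million because sqrt(0.03) ≈ 17.3% rather than a rounded 17%; at Fermi precision the difference doesn't matter.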

A Personal (Interim) COVID-19 Postmortem

But what about the ~3 months of lockdown and massive economic disruption that we had to go through? Don't you think that could have been avoided by closing our borders tightly in January? Do we have evidence to either confirm or exclude that now?
