Roko

What is our true life expectancy?

The expectation is probably around 1 billion:

10% × 10 billion years (live roughly as long as the universe has existed for already) +

90% × die within 1000 years (likely within 70 for most people here!)

Total: 1 billion
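The arithmetic above can be sketched in a couple of lines; the short-lived branch barely moves the total:

```python
# Two-scenario expected-value calculation for life expectancy,
# using the probabilities and durations given above.
p_long, years_long = 0.10, 10e9    # live roughly as long as the universe has existed
p_short, years_short = 0.90, 1e3   # die within ~1000 years

expectation = p_long * years_long + p_short * years_short
print(f"{expectation:.3g} years")  # ~1 billion; the 90% branch contributes only ~900 years
```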

I am ignoring the infinite possibilities, since any finite system will eventually start repeating configurations. So I don't think infinite life for a human even makes sense: you'd just be rerunning the life you already had, and I don't think that counts.

Have the lockdowns been worth it?

A spreadsheet model I made investigates the trade-off between life-years lost to covid-19 deaths and life-years lost to reduced quality of life while locked down. You can make a copy of it and play with the various parameters.

The spreadsheet uses real data about the mortality risk from covid-19 and the population structure (life expectancy, population pyramid) for the USA.

With the parameters that I chose, lockdowns of 1.25 years destroy about 0.25 life-years per person on net.

https://docs.google.com/spreadsheets/d/1wBcHkt9i_4hGSXRrqurr82j-R_X03nyr_Eds6YLjkzc/
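The shape of the trade-off can be sketched in a few lines. This is a minimal illustration, not the spreadsheet's actual model: every parameter value below is a placeholder assumption, not a number from the linked sheet.

```python
# Illustrative sketch of the lockdown life-years trade-off.
# All parameter values here are placeholder assumptions, NOT the
# values used in the linked spreadsheet.
population = 330e6            # USA, roughly
covid_deaths_averted = 1e6    # hypothetical deaths averted by locking down
life_years_per_death = 10     # avg life-years lost per covid death (deaths skew old)
lockdown_years = 1.25
quality_loss = 0.3            # fraction of a life-year's value lost per year locked down

life_years_saved = covid_deaths_averted * life_years_per_death
life_years_destroyed = population * lockdown_years * quality_loss

net_per_person = (life_years_saved - life_years_destroyed) / population
print(f"net life-years per person: {net_per_person:+.2f}")
```

With these made-up inputs the net comes out negative; the point is only that the sign flips depending on how you set the parameters, which is what the spreadsheet lets you explore.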

The Solomonoff Prior is Malign

adding execution/memory constraints penalizes all hypotheses

In reality these constraints do exist, so the question of "what happens if you don't care about efficiency at all?" is really not important. In practice, efficiency is absolutely critical and everything that happens in AI is dominated by efficiency considerations.

I think that mesa-optimization will be a problem. It probably won't look like aliens living in the Game of Life though.

It'll look like an internal optimizer that just "decides" that the minds of the humans who created it are another part of the environment to be optimized for its not-correctly-aligned goal.

Fermi Challenge: Trains and Air Cargo

Related to this question, I discovered a new rule for Fermi estimates.

If you want to estimate the mean value of a lognormally distributed random variable, giving the middle order of magnitude will be wrong, since lognormal distributions are skewed.

There is a simple rule for getting this right that I discovered: take your middle order of magnitude (i.e. if you think it's between 10 and 100, the middle is 10^1.5, so the exponent is 1.5) and add 1.15 times the square of your estimated standard deviation in log-space. So in this case that's 1.5 + 1.15 × 0.5² ≈ 1.79. Then take 10 to that power. This gives you about 60 - twice the answer you would have gotten with the middle order of magnitude.
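A quick Monte Carlo sanity check of the rule, using the same "between 10 and 100" example (treating that range as ±1 standard deviation of log10(X), as in the text):

```python
import random
import statistics

random.seed(0)

# log10(X) ~ Normal(1.5, 0.5): "somewhere between 10 and 100",
# with the middle order of magnitude at 10^1.5.
mu, sigma = 1.5, 0.5
samples = [10 ** random.gauss(mu, sigma) for _ in range(500_000)]

rule = 10 ** (mu + 1.15 * sigma ** 2)   # the correction rule from the text
naive = 10 ** mu                        # the middle order of magnitude

print(f"simulated mean ~ {statistics.fmean(samples):.1f}")
print(f"rule gives       {rule:.1f}")   # ~61, roughly double the naive 31.6
```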

This applies to things like "how big is a country" and "how many miles of track is there for a country of a given size" which might both be lognormally distributed.

https://www.lesswrong.com/posts/LEntkjvDSxStdGN39/shortform?commentId=soxf4Ynakr7aSjM3m

The Solomonoff Prior is Malign

It seems to me that using a combination of execution time, memory use and program length mostly kills this set of arguments.

Something like a game-of-life initial configuration that leads to the eventual evolution of intelligent game-of-life aliens who then strategically feed outputs into GoL in order to manipulate you may have very good complexity performance, but both the speed and memory are going to be pretty awful. The fixed cost in memory and execution steps of essentially simulating an entire universe is huge.

But yes, the pure complexity prior certainly has some perverse and unsettling properties.

EDIT: This is really a special case of Mesa-Optimizers being dangerous. (See, e.g. https://www.lesswrong.com/posts/XWPJfgBymBbL3jdFd/an-58-mesa-optimization-what-it-is-and-why-we-should-care). The set of dangerous Mesa-Optimizers is obviously bigger than just "simulated aliens" and even time- and space-efficient algorithms might run into them.

Roko's Shortform

One weird trick for estimating the expectation of Lognormally distributed random variables:

If you have a variable X that you think is somewhere between 1 and 100 and is lognormally distributed, you can model it as a random variable with distribution ~ Lognormal(1,1) - that is, its base-10 logarithm has a distribution ~ Normal(1,1).

What is the expectation of X?

Naively, you might say that since the expectation of log(X) is 1, the expectation of X is 10^1 = 10. That seems to make sense: 10 is the midpoint of 1 and 100 on a log scale.

This is wrong, though: the possibility of larger values dominates the expectation of X.

But how can you estimate that correction? It turns out that the rule you need is 10^(1 + 1.15*1^2) ≈ 141.

In general, if X ~ Lognormal(a, b) where we are working to base 10 rather than base e, this is the rule you need:

E(X) = 10^(a + 1.15*b^2)

The 1.15 is actually ln(10)/2.
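The formula is easy to verify by simulation (base-10 convention, as above):

```python
import math
import random
import statistics

random.seed(0)

a, b = 1.0, 1.0                          # log10(X) ~ Normal(1, 1)
samples = [10 ** random.gauss(a, b) for _ in range(1_000_000)]

formula = 10 ** (a + 1.15 * b ** 2)      # the rule: ~141
exact_const = math.log(10) / 2           # ~1.1513, the "1.15" in the rule

print(f"formula:        {formula:.0f}")
print(f"simulated mean: {statistics.fmean(samples):.0f}")
print(f"ln(10)/2 =      {exact_const:.4f}")
```

Note the simulated mean converges slowly here: with b = 1 the distribution is heavy-tailed enough that rare huge draws carry a lot of the expectation, which is exactly the phenomenon the correction accounts for.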

For a product of several independent lognormals, you can just multiply these together, which means adding in the exponent. If you have 2 or 3 things which are all lognormal, the variance-associated corrections can easily add up to quite a lot.

Remember: add 1.15 times the sum of the squared log-standard-deviations (i.e. the log-variances)!

Fermi Challenge: Trains and Air Cargo

A country with trains might have the equivalent of 3-10 times the length of the country worth of train tracks, and countries are roughly 600-3000 miles across, with perhaps 10-100 countries with a lot of trains that are also not far below the 600 mile size. The geometric means are approximately 5.5 (length multiplier), 1,350 miles, and 30 countries. Multiply all of these and you get 222,750 miles.

Q1 Answer: 222,750 miles.

Post-hoc edit: If you take these numbers and do the right math on them, you get a more accurate answer. The right math for this kind of thing is not just multiplying the geometric means: you want to take 50 draws from a random variable that is the product of country size S and length multiplier L, each lognormally distributed. If you do that you get about 600,000; if you do it for 30 countries, you get about 346,000. So, using the numbers I gave with some actual math, it's 346,000. But I looked at the correct answer before deciding to bother to do the math.
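The "actual math" can be sketched as a simulation. One modelling assumption on my part: the stated ranges (600-3000 miles, 3-10× multiplier) are treated as ±1σ intervals in log space, which is my reading of the estimate rather than something specified above.

```python
import math
import random
import statistics

random.seed(0)

def draw_lognormal(lo, hi):
    """Draw X with log10(X) ~ Normal, treating [lo, hi] as a +/-1 sigma range.
    (Treating the ranges as 1-sigma intervals is an assumption.)"""
    mu = (math.log10(lo) + math.log10(hi)) / 2
    sigma = (math.log10(hi) - math.log10(lo)) / 2
    return 10 ** random.gauss(mu, sigma)

def mean_total_track(n_countries, trials=20_000):
    """Mean of: sum over countries of (country size S) x (length multiplier L)."""
    totals = (
        sum(draw_lognormal(600, 3000) * draw_lognormal(3, 10)
            for _ in range(n_countries))
        for _ in range(trials)
    )
    return statistics.fmean(totals)

estimate = mean_total_track(30)
# Comes out well above the 222,750 you get by just multiplying geometric means,
# because the lognormal variance corrections inflate each factor's mean.
print(f"{estimate:,.0f} miles")
```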

Air freight is perhaps 10%-30% of passenger travel, since I see more passenger planes than freight planes at airports. Passenger travel is easier to estimate: say each of 1,000,000,000 people travels twice per year by air, and 15 people plus their baggage make one metric ton. Taking 17% (roughly the geometric mean of 10% and 30%): 2 × 17% × 1,000,000,000 / 15 ≈ 22,000,000 metric tons.

Q2 Answer: 22,000,000 metric tons per year for both years.
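The Q2 arithmetic, with the 17% figure made explicit as the geometric mean of the 10%-30% range (that reading of where 17% comes from is my assumption):

```python
import math

passengers = 1_000_000_000                   # people flying per year
trips_per_year = 2
people_per_tonne = 15                        # 15 people + baggage ~ 1 metric ton
freight_fraction = math.sqrt(0.10 * 0.30)    # geometric mean of 10% and 30%, ~0.17

tonnes = trips_per_year * freight_fraction * passengers / people_per_tonne
print(f"{tonnes:,.0f} metric tons per year")  # ~23 million, close to the ~22 million above
```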

A Personal (Interim) COVID-19 Postmortem

But what about the ~3 months of lockdown and massive economic disruption that we had to go through? Don't you think that could have been avoided by closing our borders tightly in January? Do we have evidence to either confirm or exclude that now?

A Personal (Interim) COVID-19 Postmortem

So do you think that the actual travel restrictions that happened were just a waste of time, and we should have had fully open borders?

Or do you think that the restrictions that we had (late and partial) were the optimal disease-fighting policy (again, neglecting political considerations)?

I mean, they got the answer right, so it seems a little arrogant to call them biased. Maybe they just had a good heuristic to do with preference falsification?