Inconvenient consequences of the logic behind the second law of thermodynamics

The prediction that the sun and stars we perceive will go out is absurd only if you exclude the possibility that you are dreaming.  In what we label as dreams, we frequently perceive things that quickly pop out of existence.

Extracting Money from Causal Decision Theorists

I'm confused: as a buyer, if I believed the seller could predict my choice with probability .75, I would flip a fair coin to decide which box to take, meaning the seller couldn't actually predict with probability .75.  If I can't randomize to pick a box, I'm not sure how to fit what you are doing into standard game theory (which I teach).
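A minimal simulation of the point above (the payoff structure of the original post is omitted; this only checks the prediction-accuracy claim): if the buyer chooses by fair coin, no seller strategy, fixed or random, can exceed 50% prediction accuracy on average.

```python
import random

random.seed(0)

trials = 100_000
correct = 0
for _ in range(trials):
    buyer_choice = random.choice(["box1", "box2"])  # buyer flips a fair coin
    seller_prediction = "box1"  # any prediction rule; a fixed one shown here
    if seller_prediction == buyer_choice:
        correct += 1

accuracy = correct / trials
# accuracy converges to 0.5, well below the assumed 0.75
```

Since the coin is independent of anything the seller can observe, every prediction rule has exactly a 1/2 chance of matching, which is the source of the apparent inconsistency with the .75 assumption.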

Appendices to cryonics signup sequence

"Over the past few years, some people have updated toward pretty short AGI timelines. If your timelines are really short, then maybe you shouldn't sign up for cryonics, because the singularity – good or bad – is overwhelmingly likely to happen before you biologically die"  

But such a scenario means there is less value in saving for retirement, which should make it financially easier for you to sign up for cryonics.  Also, the sooner we get friendly AGI, the sooner people in cryonics will be revived, meaning there is a lower risk that your cryonics provider will fail before you can be revived.

The Case for a Journal of AI Alignment

Strongly agree.  I would be happy to help.  Here are three academic AI alignment articles I have co-authored:
https://arxiv.org/abs/2010.02911
https://arxiv.org/abs/1906.10536
https://arxiv.org/abs/2003.00812

Why the outside view suggests that longevity escape velocity is a long time away and cryonics is a much more feasible option for those alive today: signal-boosting a comment by Calm-Meet9916 on Reddit

While not captured by the outside view, I think the massive recent progress in machine learning should give us considerable hope of achieving LEV within 30 years.

Covid 12/24: We’re F***ed, It’s Over

Yes: the more people infected with the virus, and the longer the virus persists in people, the more opportunities there are for a successful mutation to arise.
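A hedged back-of-the-envelope sketch of that scaling (the per-person-day probability `p` is made up for illustration, not an empirical estimate): if each infected person-day carries a small independent chance of producing a dangerous variant, the chance that at least one arises grows with both the number infected and the duration.

```python
# Assumed (illustrative) probability that one infected person-day
# produces a successful new variant.
p = 1e-9

def p_variant(infected: int, days: int) -> float:
    """Probability of at least one successful mutation arising,
    treating each infected person-day as an independent trial."""
    person_days = infected * days
    return 1 - (1 - p) ** person_days

small_outbreak = p_variant(1_000_000, 30)     # ~3% under these assumptions
large_pandemic = p_variant(50_000_000, 180)   # near-certain under these assumptions
```

Whatever the true value of `p`, the qualitative point stands: risk compounds in infections × time, so both reducing spread and shortening the pandemic cut the chance of a harmful mutation.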

Covid 12/24: We’re F***ed, It’s Over

I did a series of podcasts on COVID with Greg Cochran, and Greg was right early on.  Greg has said from the beginning that the risk of a harmful mutation is reasonably high because the virus is new, meaning there are likely many potential beneficial mutations (from the virus's viewpoint) that have not yet been found.


[Linkpost] AlphaFold: a solution to a 50-year-old grand challenge in biology

From an AI safety viewpoint, this might greatly increase AI funding and draw talent into the field, and so bring forward the date at which we get a general artificial superintelligence.

Snyder-Beattie, Sandberg, Drexler & Bonsall (2020): The Timing of Evolutionary Transitions Suggests Intelligent Life Is Rare

Yes.  Given a high concentration of observers, and if high-tech civilizations have strong incentives to grab galactic resources as quickly as they can, thereby preventing the emergence of other high-tech civilizations, then most civilizations such as ours will exist in universes with some kind of late great filter that knocks civilizations down before they can become spacefaring.
