Liam Donovan


[$20K in Prizes] AI Safety Arguments Competition

Maybe you could measure how effectively people pass e.g. a multiple-choice version of an Intellectual Turing Test (on how well they can emulate the viewpoint of people concerned about AI safety) after hearing the proposed explanations.

[Edit: To be explicit, this would help further John's goals (as I understand them) because it ideally tests whether the AI safety viewpoint is being communicated in such a way that people can understand and operate the underlying mental models. This is better than testing how persuasive the arguments are because it a) is more in line with general principles of epistemic virtue and b) is more likely to persuade people iff the specific mental models underlying AI safety concern are correct.

 One potential issue would be people bouncing off the arguments early and never getting around to building their own mental models, so maybe you could test for succinct/high-level arguments that successfully persuade target audiences to take a deeper dive into the specifics? That seems like a much less concerning persuasion target to optimize, since the worst case is people being wrongly persuaded to "waste" time thinking about the same stuff the LW community has been spending a ton of time thinking about for the last ~20 years]

Don't die with dignity; instead play to your outs

I strongly prefer the "dying with dignity" mentality for 3 basic reasons:

  • as other people have mentioned, "playing to your outs" is too easy to misinterpret as conditioning on comfortable improbabilities no matter how much you try to draw the distinctions
  • relatedly, focusing on "playing to your outs" (especially if you do so for emotional reasons) may make it harder to stay grounded in accurate models of reality (that may mostly output "we will die soon")
  • Adopting the mindset that death is likely while AGI is still some ways off and easy to ignore ought to make it easier to stay emotionally resilient, and ready to exploit miracle opportunities, if/when AGI is looming and impossible to ignore

Of these, the 3rd feels the most important to me, partly because I've seen it discussed least. It seems like if Eliezer's basic model is right, a significant portion of the good outcomes require some kind of miracle occurring at crunch time, which will presumably be easier to obtain if key players are emotionally prepared and not suddenly freaking out for the first time (on an emotional/subconscious level). I know basically nothing about psychology, but isn't it a bad sign if you retreat to "oh, death with dignity is unmotivating, let's just focus on our outs" when AGI is less salient?

2022 ACX Predictions: Buy/Sell/Hold

Wait, why are your predictions for Brazil so far from the market? As of right now, there are 180,000 shares of Bolsonaro on the order book under 50c on FTX (avg price of 44c if you buy them all).
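The "average price if you buy them all" figure is just a volume-weighted average over the book's price levels. A minimal sketch, using made-up levels that happen to total 180,000 shares at an average of 44c (the actual FTX book is not reproduced in the comment):

```python
# Hypothetical order book: (price in cents, shares offered) per level.
# These numbers are illustrative placeholders, not the real FTX book.
book = [(40, 60_000), (45, 60_000), (47, 60_000)]

total_shares = sum(qty for _, qty in book)
total_cost = sum(price * qty for price, qty in book)
avg_price = total_cost / total_shares  # volume-weighted average fill price

print(f"{total_shares:,} shares at an average of {avg_price:.1f}c")
# -> 180,000 shares at an average of 44.0c
```

Sweeping every level under 50c and dividing total cost by total shares gives the average fill price a taker would pay.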

Polymarket Covid-19 1/17/2022

Yeah, it's definitely against Polymarket's terms of service, but not against US law (otherwise they wouldn't be complying with the prohibition on offering their services to US customers)

Polymarket Covid-19 1/17/2022

FWIW it is totally legal for Americans to trade on Polymarket via a VPN or similar; it's just not legal for Polymarket itself to offer services to people with US IP addresses

Occupational Infohazards

Yep, I wanted to experiment with a central example of a comment that should be in the "downvote/agree" quadrant, since that seemed like the least likely to occur naturally. It's nice to see the voting system is working as intended. 

Zvi’s Thoughts on the Survival and Flourishing Fund (SFF)

I haven't done much research on this, but from a naive perspective, spending 4 billion dollars to move up vaccine access by a few months sounds incredibly unlikely to be a good idea? Is the idea that it is more effective than standard global health interventions in terms of QALYs or a similar metric, or that there's some other benefit that is incommensurable with other global health interventions? (This feels like asking the wrong question but maybe it will at least help me understand your perspective)

Omicron Post #4

Wait, how do you get to 17%-25% chance of a crisis situation if there's only a 2.5% chance of omicron causing severe disease in vaccinated/previously infected people? Isn't that the vast majority of people in the US?
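One way the two numbers might be reconciled is that even a 2.5% severe-disease rate, applied to a large enough infected population, can still produce hospital-crisis-scale case counts. A back-of-the-envelope sketch, where every input below is a hypothetical placeholder rather than a figure from the post:

```python
# Back-of-the-envelope: a small per-person severe-disease rate can still
# matter at population scale. All inputs are assumed placeholder values.
population = 330_000_000       # rough US population
protected_fraction = 0.85      # vaccinated or previously infected (assumed)
infected_fraction = 0.30       # share infected in a large wave (assumed)
severe_rate_protected = 0.025  # the 2.5% rate cited in the comment

severe_cases = (population * protected_fraction
                * infected_fraction * severe_rate_protected)
print(f"{severe_cases:,.0f} severe cases among the protected alone")
```

With these placeholder numbers that comes to roughly two million severe cases among the protected group by itself, which is the kind of load that could plausibly drive a crisis probability well above the per-person risk.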
