followthesilence


Spoiler: Less than 1% will admit they were wrong. Straight denial, reasoning that it doesn't actually matter, or pretending they knew all along that lab origin was possible are all preferable alternatives. Admitting you were wrong is career suicide.

The political investments in natural origin are strong. Trump claiming a Chinese lab was responsible automatically put a large chunk of Americans in the opposite camp. My interest in the topic actually started with reading up to confirm why he was wrong, only to find the Daszak-orchestrated Lancet letter that miscited numerous articles, and the Proximal Origin paper, which might be one of the dumbest things I've ever read. The Lancet letter's declaration that "lab origin theories = racist" influenced discourse in a way that cannot be overstated. It also seems many view deadlier viruses as a corollary of climate change: a notion that civilizing more square footage of earth means we are inevitably bound to suffer nature's increasing wrath in the form of increasingly virulent, deadly pathogens.

The professional motivations are stark and gross. As Upton Sinclair put it: "It is difficult to get a man to understand something, when his salary depends on his not understanding it." Thoughts on the origin are frequently dismissed if you're not a virologist. But all the money in virology is in gain-of-function research. Oops!

Apologies, I'm not trying to dispute math identities. And thank you, the link provided helps put words to my gut concern: that this essay's conclusion relies heavily on a multi-stage fallacy, and that arriving at a point probability estimate for each event independently is fraught.

Thanks, I suppose I'm taking issue with sequencing five distinct conditional events that seem to be massively correlated with one another. The likelihoods of Events 1-5 seem to depend upon each other in ways such that you cannot assume point probabilities for each event and multiply them together to arrive at 1%. Event 5 certainly doesn't require Events 1-4 as a prerequisite, and arguably makes Events 1-4 much more likely if it comes to pass.

Can you explain how Events #1-5 from your list are not correlated? 

For instance, I'd guess #2 (learns faster than humans) follows naturally -- or is much more likely -- if #1 (algos for transformative AI) comes to pass. Similarly, #3 (inference costs <$25/hr) seems to me a foregone conclusion if #5 (massive chip/power scale) and #2 happen.

Treating the first five as conditionally independent puts you at 1% before arriving at 0.4% with external derailments, so it's doing most of the work to make your final probability minuscule. But I suspect they are highly correlated events and would bet a decent chunk of money (at 100:1 odds, at least) that all five come to pass.
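To make the correlation point concrete, here's a toy Monte Carlo sketch (my own illustration, not from the essay). Assume all five events share a single hypothetical latent driver, e.g. "transformative AI progress continues"; each event is likely if the driver holds and unlikely otherwise. Each event's marginal probability comes out around 50%, so naively multiplying marginals gives ~3%, but the true joint probability of all five is roughly an order of magnitude higher:

```python
import random

random.seed(0)
N = 100_000

def trial():
    # Shared latent driver (a stand-in for whatever correlates the events).
    z = random.random() < 0.5
    # Each of the five events is likely given the driver, unlikely otherwise.
    return [random.random() < (0.9 if z else 0.1) for _ in range(5)]

samples = [trial() for _ in range(N)]

# Marginal probability of each event: 0.5*0.9 + 0.5*0.1 = 0.5
marginals = [sum(s[i] for s in samples) / N for i in range(5)]

# Naive "multiply the point estimates" answer: 0.5^5 ~= 3%
product = 1.0
for m in marginals:
    product *= m

# True joint probability: 0.5*0.9^5 + 0.5*0.1^5 ~= 30%
joint = sum(all(s) for s in samples) / N

print(marginals, product, joint)
```

The marginals alone are exactly what a forecaster would report for each event in isolation, yet multiplying them understates the joint by ~10x. That's the mechanism by which assuming independence across Events 1-5 drives the final number down.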