I'm looking for feedback on the following idea. The article from which it's been excerpted can be found here: http://ieet.org/index.php/IEET/more/torres20120213
"But not only has the number of scenarios increased in the past 71 years; many riskologists believe that the probability of a global disaster has also risen significantly. Whereas the likelihood of annihilation for most of our species’ history was extremely low, Nick Bostrom argues that “setting this probability lower than 25% [this century] would be misguided, and the best estimate may be considerably higher.” Similarly, Sir Martin Rees claims that a civilization-destroying event before the year 02100 is as likely as getting “heads” on a coin flip. These are only two opinions, of course, but to paraphrase the Russell-Einstein Manifesto, my experience confirms that those who know the most tend to be the most gloomy.
"I [would] argue that Rees’ figure is plausible. To adapt a maxim from the philosopher David Hume, wise people always proportion their fears to the best available evidence, and when one honestly examines this evidence, one finds that there really is good reason for alarm. But I also offer a novel — to my knowledge — argument for why we may be systematically underestimating the overall likelihood of doom. In sum, just as a dog can’t possibly comprehend any of the natural and anthropogenic risks mentioned above, so too could there be risks that forever lie beyond our epistemic reach. All biological brains have intrinsic limitations that constrain the library of concepts to which one has access. And without concepts, one can’t mentally represent the external world. It follows that we could be “cognitively closed” to a potentially vast number of cosmic risks that threaten us with total annihilation. This being said, one might argue that such risks, if they exist at all, must be highly improbable, since Earth-originating life has existed for some 3.5 billion years without an existential catastrophe occurring. But this line of reasoning is deeply flawed: it fails to take into account that the only worlds in which observers like us could find ourselves are ones in which such a catastrophe has never occurred. It follows that a record of past survival on our planetary spaceship provides no useful information about the probability of certain existential disasters happening in the future. The facts of cognitive closure plus the observation selection effect suggest that our conjectures about the probability of total annihilation may be systematic underestimates, perhaps by a lot."