Comments

Sorry, I wasn't implying very strong confidence. I would give a probability of, say, 65% that my reason is the principal cause of Cousin_it's feelings.

Sure. That's why I said: "I welcome alternative theories" (including theories about there being multiple different reasons which may apply to different extents to different people). Do you have one?

I have personally felt the same feelings and I think I have pinned down the reason. I welcome alternative theories, in the spirit of rational debate rather than polite silence.


It's because talking about the singularity and the end of the world in near mode for a long time makes you alieve that it's going to happen, in the same way that it actually happening would make you alieve it; talking about it once, believing it, and then never thinking about it explicitly again wouldn't.

It all depends on how small that small chance is. Pascal's mugging is typically done with probabilities that are exponentially small, e.g. 10^-10 or so.

But what if Holden is going to not recommend SIAI for donations when there's a 1% or 0.1% chance of it making that big a difference?
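
To make the arithmetic behind the two comments above concrete, here is a minimal toy sketch in Python. Every number in it (the cost of the disaster, the donation size, the dollars needed to shift the risk) is invented purely for illustration; none of the figures come from Holden, GiveWell, or SIAI.

```python
# Toy expected-value comparison with invented numbers (illustration only;
# none of these figures come from Holden, GiveWell, or SIAI).
DISASTER_COST = 7e9        # hypothetical cost of the event, measured in lives
DONATION = 1_000           # hypothetical donation, in dollars
COST_TO_SHIFT_RISK = 1e8   # hypothetical dollars needed to shift the risk by the amounts below

for risk_reduction in (1e-2, 1e-3, 1e-10):
    # Expected lives saved per donated dollar under these assumptions.
    lives_per_dollar = risk_reduction * DISASTER_COST / COST_TO_SHIFT_RISK
    print(f"risk shift {risk_reduction:.0e}: "
          f"~{lives_per_dollar * DONATION:.2e} expected lives per ${DONATION} donated")
```

Under these made-up assumptions a 1% or 0.1% risk shift is worth hundreds (or tens) of expected lives per thousand dollars, while a 10^-10 shift is worth a few millionths of a life, which is why the size of the "small chance" carries the whole argument.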

I suspect that Holden would also consider Robin Hanson a competent critic. This is because Robin is smart, knowledgeable and prestigiously accredited.

But your comment has alerted me to the fact that even if Hanson comes out as a flat-earther tomorrow, the supporting posts are still weak.

The issue of the two most credible critics of SIAI disagreeing with each other is logically independent of the issue of Holden's wobbly argument against the utilitarian argument for SIAI. Many thanks.

But if there's even a chance …

Holden cites two posts (Why We Can’t Take Expected Value Estimates Literally and Maximizing Cost-effectiveness via Critical Inquiry). They are supposed to support the argument that small or very small changes to the probability of an existential risk event occurring are not worth caring about or donating money towards.

I think that these posts both have serious problems (see the comments, especially Carl Shulman's). In particular, Why We Can’t Take Expected Value Estimates Literally was heavily criticised by Robin Hanson in On Fudge Factors.

Robin Hanson has been listed as the other major "intelligent/competent" critic of SIAI. That he criticises what seems to be the keystone of Holden's argument should be cause for concern for Holden (after all, if "even a chance" is good enough, then all the other criticisms melt away).

This would be a much more serious criticism of SIAI if Holden and Hanson could come to agreement on what exactly the problem with SIAI is, and if Holden could sort out the problems with these two supporting posts.*

(*Of course, they won't do that without substantial revision of one or both of their positions, because Hanson is on the same page as the rest of SIAI with regard to expected utility; see On Fudge Factors. Hanson's disagreement with SIAI is a different one: approximately, that he thinks ems coming first is likely, that a singleton is both bad and unlikely, and that his axiology is unintuitive enough that he is not really on the same page as most people about what counts as a good or bad outcome.)

"As Moldbug has convincingly argued on his blog, intellectual fashion among the ruling class follows intellectual fashion on Harvard by an offset of about one generation. A generation after that the judicial and journalist class exiles any opposition to such thought from public discourse"

then

"creationism is still around"

Contradiction much?

"because creationism is not a serious threat to The Cathedral"

If the "judicial and journalist class" only attacks popular irrational ideas which are "a serious threat to The Cathedral", then what other irrationalities will get through? Maybe very few irrational ideas are a "serious threat to The Cathedral", in which case you just admitted that academia cannot "proselytise it's output". What about antivax? Global warming denial? Theism? Anti-nuclear-power irrationality? Crazy, ill-thought-through and knee jerk anticapitalism of the OWS variety? So many popular irrational beliefs...

But as I said in my comment, there are numerous issues (creationism, the moon landing hoax, antivax, global warming denial, and I should add theism) where a large share of public opinion diverges sharply from the opinions of the vast majority of academics. So clearly the elite universities are not actually that good at proselytizing their output.

Perhaps it has been downvoted because people see elite universities with large endowments and lots of alumni in Congress? But still, that money cannot be spent on proselytizing. And how exactly is a politician who went to Stanford or Harvard supposed to have the means and motive to come out against a popular falsehood? Somehow science is not doing so well against creationism. As an example, Rick Santorum went to Penn State (a Public Ivy), but then expressed the view that humans did not evolve from "monkeys". Newt Gingrich actually was a lecturer, and said intelligent design should be taught in schools.

EDIT: Also, yes, I am stupid in an absolute sense. If I were smart, I would be rich & happy ;-0
