In the UK, I think the most common assumption for cauliflower ear would be playing rugby, rather than a combat sport.

No idea if that's the statistically correct inference from seeing someone with the condition.
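
(To spell out what the statistically correct version would look like, as a sketch only with every rate hypothetical: by Bayes' rule,

P(rugby | cauliflower ear) = P(cauliflower ear | rugby) · P(rugby) / Σ_c P(cauliflower ear | c) · P(c)

summing over the possible causes c, so rugby's much larger participation base in the UK could outweigh a higher per-participant rate of the condition in combat sports.)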

I enjoyed filling this out!

The question here asks the opposite of what its title suggests:

Unknown features

Which of the following features of the LessWrong website did you know how to use before you read this question?

That could result in some respondents answering in reverse if they skim.

As well as the generic suggestions people are making in the answers, you might get more specific ones if the question specified whether you're looking for long-distance or nearby/in-person dating, and (if the latter) roughly where you're located.

You've got an asterisk in the first sentence, but I couldn't see it referencing anything.

~1.2e16 bases annually

Is this a typo? If I'm reading the chart correctly, it looks like it's on the order of 1.2e15.

Blaise Pascal – the I Think Therefore I Am guy

The 'I think therefore I am' guy was René Descartes.

https://en.wikipedia.org/wiki/Cogito,_ergo_sum

If the market are

Grammar: "market is" or "markets are".

You mention here that "of course" you agree that AI is the dominant risk, and that you rate p(doom) somewhere in the 5-10% range.

But that wasn't at all clear to me from reading the opening to the article.

Eliezer Yudkowsky predicts doom from AI: that humanity faces likely extinction in the near future (years or decades) from a rogue unaligned superintelligent AI system. ... I have evaluated this model in detail and found it substantially incorrect...

As written, that opener suggests to me that you think the overall model of doom being likely is substantially incorrect, not just the specific claims I've elided about doom being the default outcome.

I feel it would be very helpful to the reader to ground the article with the note you've made here somewhere near the start: i.e., that your argument is with EY's specific doom case, and that you retain a significant p(doom), but one based on different reasoning.

Years back I heard that 10 is a bad number for this kind of thing.

The reasoning goes that because it's a round number, people will assume you chose it to be round and that at least some of your entries are filler to get up to that number.

Whereas if you have 11 reasons, or 7, or whatever, people will assume that's the actual number of points you had to make.
