Sweetgum

I think you're massively overestimating Eliezer Yudkowsky's intelligence. I would guess it's somewhere between +2 and +3 SD.
Who are some other examples?
What did you think of?
Even if it wasn't meant to be an allegory for race science, I'm pretty sure it was meant to be an allegory for similarly-taboo topics rather than religion. Religious belief just isn't that taboo.
Hmm, it seems like you might be treating this post as an allegory for religion because of the word "agnostic", but I'm almost certain that it's not. I think it's about "race science"/"human biodiversity"/etc., i.e. the claim "[ethnicity] are genetically predisposed to [negative psychological trait]".
Before I do that, though, it's clear that horrible acts have been committed in the name of dragons. Many dragon-believers publicly or privately endorse this reprehensible history. Regardless of whether dragons do in fact exist, repercussions continue to have serious and unfair downstream effects on our society.
While this could work as a statement about religious people, it seems a lot more true for modern racists than modern religious... (read more)
There are more rich people that choose to give up the grind than poor people.
Did you mean to say "There are more poor people that choose to give up the grind than rich people?"
So, according to this estimate, if we could freeze-frame a single moment of our working memory and then explain all of the contents in natural language, it would take about a minute to accomplish.
This seems like a potentially misleading description of the situation. It suggests that the contents of working memory could always be described in one minute of natural language, but that isn't implied (as I'm sure you know based on your reasoning in this post). A 630-digit number cannot be described in one minute of natural language. 2016 bits of memory and about 2016 bits of natural language per minute really means that if our working memory... (read more)
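The arithmetic behind this point can be checked directly. A minimal sketch, assuming the post's figures of 2016 bits of working-memory capacity and roughly 2016 bits of natural language per minute; each decimal digit carries log2(10) ≈ 3.32 bits:

```python
import math

BITS_PER_DIGIT = math.log2(10)   # ~3.32 bits of information per decimal digit
WM_BITS = 2016                   # working-memory capacity estimate from the post
SPEECH_BITS_PER_MIN = 2016       # natural-language rate estimate from the post

# How many arbitrary decimal digits fit in one minute of speech?
digits_per_minute = SPEECH_BITS_PER_MIN / BITS_PER_DIGIT
print(round(digits_per_minute))  # ~607 digits

# A 630-digit number carries more information than the one-minute budget.
bits_in_630_digits = 630 * BITS_PER_DIGIT
print(bits_in_630_digits > SPEECH_BITS_PER_MIN)
```

So a 630-digit number (~2093 bits) overflows both the 2016-bit memory estimate and one minute of speech; the equality of the two rates only means the *typical*, highly-compressible contents could be described that fast.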
Even assuming perfect selfishness, sometimes the best way to get what you want (X) is to coordinate to change the world in a way that makes X plentiful, rather than fighting over the rare Xs that exist now, and in that way, your goals align with other people who want X.
E.g. learning when you're rationalizing, when you're avoiding something, when you're deluded, [...] when you're really thinking about something else, etc.
It seems extremely unlikely that these things could be seen in fMRI data.
I think I got it. Right after the person buys X for $1, you offer to buy it from them for $2, but with a delay, so they keep X for another month before the sale goes through. After the month passes, they now value X at $3, so they're willing to pay you $3 to buy it back, and you end up with +$1.
But are you sure the way in which he is unique among people you've met is mostly about intelligence rather than intelligence along with other traits?