jwoodward48
Comments

Sorted by Newest
LessWrong and Miri mentioned in major German newspaper's article on Neoreactionaries
jwoodward48 · 9y

"Yudkowsky founded the Machine Intelligence Research Institute, with the money of venture capitalist Peter Thiel, which focusses on the development of a benevolent AI. It should be benevolent, because it will have power over people, when it gets smarter than them. For the neoreactionaries, Intelligence is necessarily related to politics. based on the concept of human biodiversity, they believe that social and economical differences are caused and justified by a genetic difference in intelligence among ethnic groups. They reject the idea of a common human nature."

Oh, come on, that's a poorly-thought-out attack. "Yudkowsky thinks that AI will be super-powerful. Neo-reactionists think that powerful people are powerful and smart for genetic reasons. Therefore, Yudkowsky has something to do with neo-reactionism." Really?

Great post on Reddit about accepting atheism
jwoodward48 · 9y

A person is nature plus nurture, and besides, I'm not even sure if DNA alone would produce the same baby. Epigenetics, womb variation, and whatnot all have an effect even before a child is born.

Why Truth?
jwoodward48 · 9y

I know! Is the world not more beautiful when one can understand how it works?

Einstein's Arrogance
jwoodward48 · 9y

"The probability that the universe only has finite space is not exactly 1, is it?"

Nooooo, that's not it. The probability that the space reachable from a particular point within a given amount of time is finite is effectively one.

So it doesn't matter how large the universe is - the aliens a few trillion ly away cannot have killed Bob.
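
As a rough check (a sketch assuming light-speed as the hard causal limit and ignoring cosmic expansion, which changes the numbers but not the conclusion):

\[ d_{\max} = c \cdot t_{\text{universe}} \approx 1\ \text{ly/yr} \times 1.38 \times 10^{10}\ \text{yr} \approx 1.4 \times 10^{10}\ \text{ly} \]

A few trillion light-years is on the order of $10^{12}$ ly, over a hundred times farther than any causal influence could have travelled to reach Bob.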

Why Don't Rationalists Win?
jwoodward48 · 9y

Hmm? Ah, I see; you think that I am annoyed. No, I only quoted Lumifer because their words nearly sufficed. Rest assured that I do not blame you for lacking the ability to gather information from the future.

Why Don't Rationalists Win?
jwoodward48 · 9y

(I recognize that you meant instrumental rationality rather than epistemic rationality, and have read the comment with that in mind.)

Epistemic rationality is not equivalent to "being a Spockish asshole." It simply means that one values rationality as an end and not just a means. If you do not value correcting people's grammar for its own sake, then there is no reason to correct someone's grammar. But that is an instrumental statement, so I suppose I should step back...

If you think that epistemic and instrumental rationality would disagree at certain points, try to reconsider their relationship. Any statement of "this ought to be done" is instrumental; epistemic rationality only covers "this is true/false."

Why Don't Rationalists Win?
jwoodward48 · 9y

Sounds meaninglessly deep to me.

Why Don't Rationalists Win?
jwoodward48 · 9y

"See ETA to the comment." Lumifer meant instrumental rationality.

Righting a Wrong Question
jwoodward48 · 9y

Well, the problem with the Doomsday Argument is not the probability distribution, as I see it, but the assumption that we are "typical humans" with a typical perspective. If you think that the most likely cause of the end of humanity would be something predictable and known for millennia in advance, for example, then the assumption does not hold, as we currently do not see a guaranteed end of humanity in our future.
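
(For concreteness, a sketch of the standard calculation that the typicality assumption licenses: if your birth rank $n$ is a uniform random draw from all $N$ humans who will ever live, then

\[ P\!\left(\frac{n}{N} \geq 0.05\right) = 0.95, \quad\text{so}\quad N \leq 20n \ \text{with 95% confidence.} \]

Reject the uniform-draw premise, as above, and the conclusion never gets off the ground.)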

Righting a Wrong Question
jwoodward48 · 9y

Not entropy, but rather causation; time does not flow backwards, because what I do tomorrow will not affect what I did yesterday.
