Just someone wandering the internet. Someone smart, but not smart like all of you people. Someone silly and creative. Someone who wanders the internet to find cool, isolated areas like LessWrong.
The universe is so awesome and crazy and interesting, and I can't wait for humanity to be advanced enough to understand all of it. While you people figure out solutions to the various messes our species is in (I would prefer for Homo sapiens to still exist in 20 years), I'll be standing by for emotional support, because I'm nowhere near smart enough to be doing any of that actually important stuff. Remember to take care of your mental health while you're saving the world.
Pronouns: he/him
Even in situations where my beliefs affect my actions, those beliefs are not choices. If I notice that holding a certain belief would lead me to act in a way that gives me more utility, then that observation becomes my motivation to act as if I held that belief.
So "frustrated" is what we call "annoyed" when it comes from repeatedly failing to do something?
Looking carefully at how other people speak and write, I notice certain cognitive concepts that simply don't make sense to me, the way that it "makes sense" to me that red is "hot" and blue is "cold". I wonder if the reason I can't understand them is that my consciousness is somehow "simpler", or less "layered", although I don't really know what I mean by that; it's just a vibe I get. Here are some examples:
I feel a bit confused. I generally sort of just feel (and I don't know what exactly I mean by this) that there is less "structure" separating the "me" part of the Program The Computer-That-Is-My-Brain Is Running from the "not-me" part.
Does any of this make sense?
To me, "sensory experience" as in "the video and audio coming in from this body that I'm piloting" and "sensory experience" as in "a file containing the most recent results of the large hadron collider" are very very different.
If you have enough of one of the two types, you can probably infer the other, if you are smart enough. They are just different windows for observing the world.
I think most carnists don't eat meat out of selfishness; they either don't believe in animal sentience or, more likely, are ignorant of what factory farming is like, through no fault of their own.
Quick thought: if you have an aligned AI in a multipolar scenario, other AIs might threaten to cause S-risk as blackmail, in order to get said FAI to do things. Therefore, we should make the FAI treat X-risk and S-risk as equally bad (even though S-risk is in reality terrifyingly worse), because then other powerful AIs will simply use oblivion as a threat instead of astronomical suffering, since oblivion is much easier to bring about.
It is possible that an FAI could pull off some weird acausal decision-theory trick to make itself act as if it doesn't care about anything done in an effort to blackmail it, or something like that. But this is just to make sure.
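The argument above can be made concrete with a toy model: a blackmailer picks whichever threat yields the most leverage (the victim's disutility) per unit of its own cost to carry out. All names, numbers, and the cost asymmetry below are made-up assumptions for illustration, not anything from decision theory proper.

```python
# Toy model: a blackmailer chooses the threat maximizing the FAI's
# disutility per unit of the blackmailer's own execution cost.
# All quantities are invented for illustration.

def preferred_threat(fai_disutility, threat_cost):
    """Return the threat with the highest disutility-to-cost ratio."""
    return max(fai_disutility, key=lambda t: fai_disutility[t] / threat_cost[t])

# Assumption: oblivion (X-risk) is much cheaper to bring about than
# astronomical suffering (S-risk).
cost = {"x-risk": 1.0, "s-risk": 10.0}

# Case 1: the FAI honestly values S-risk as far worse,
# so blackmailers get more leverage from threatening S-risk.
honest = {"x-risk": 100.0, "s-risk": 10_000.0}
print(preferred_threat(honest, cost))  # s-risk

# Case 2: the FAI treats both as equally bad,
# so the cheaper X-risk threat becomes the better lever.
equal = {"x-risk": 100.0, "s-risk": 100.0}
print(preferred_threat(equal, cost))   # x-risk
```

Under these made-up numbers, equalizing the FAI's valuations flips the blackmailer's preferred threat from suffering to oblivion, which is the whole point of the proposal.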
Maybe we should be able to mark comments and posts as "unserious", so that people who prefer LessWrong to have a serious tone can simply press a button and not see them. Because as a neurodivergent rationalist, I found your observation very amusing.
This is beautiful, but I can't think of anything specific to say, so I'll just give some generic praise: I like how he only used big words when necessary.
Why was this comment so downvoted?
If Eliezer's p(doom) is so high, why is he signed up for cryonics?