KvmanThinking

Just someone wandering the internet. Someone smart, but not smart like all of you people. Someone silly and creative. Someone who wanders the internet to find cool, isolated areas like LessWrong. 

The universe is so awesome and crazy and interesting, and I can't wait for humanity to be advanced enough to understand all of it. While you people figure out solutions to the various messes our species is in (I would prefer for Homo sapiens to still exist in 20 years), I'll be standing by for emotional support, because I'm nowhere near smart enough to be doing any of that actually important stuff. Remember to take care of your mental health while you're saving the world.

Pronouns: he/him

Comments

Quick thought: If you have an aligned AI in a multipolar scenario, other AIs might threaten to cause an S-risk in order to coerce or blackmail said FAI. Therefore, we should make the FAI treat X-risk and S-risk as equally bad (even though S-risk is, in reality, terrifyingly worse), because then other powerful AIs will simply use oblivion as a threat instead of astronomical suffering (since oblivion is much easier to bring about).

It is possible that an FAI could use some weird acausal decision-theory trick to make itself act as if it doesn't care about anything done in an effort to blackmail it. But this is just to make sure.

Maybe we should be able to mark comments and posts as "unserious", so that people who prefer LessWrong to have a serious tone can simply press a button and not see them. Because, as a neurodivergent rationalist, I found your observation very amusing.

This is beautiful, but I can't think of anything specific to say, so I'll just give some generic praise. I like how he used big words only when necessary.

Why was this comment so downvoted?

Hi :) What was your first attempt at writing? I might be able to tell you why it was rejected.

What's an "anti-rationalist" group?

Great response, first of all. Strong upvoted.

My subconscious gave me the following answer, after lots of trying-to-get-it-to-give-me-a-satisfactory-answer:

"Everyone tells you that you're super smart, not because you actually are (in reality, you are probably only slightly smarter than average) but because you have a variety of other traits which are correlated with smartness (i.e: having weird hobbies/interests, getting generally good grades, knowing a lot of very big and complicated-sounding words, talking as if my speech is being translated literally from dath ilan's Baseline, and sometimes having trouble sleeping because i feel all weird and philosophical for no reason). In reality these traits do not indicate smartness, they indicate a brain architecture that deviates significantly from the average human brain architecture, and high intelligence is only one type of deviation. You just like to think you're smart, because you like the connotation of the word smart more than you do eccentric. Which you are, by the way."

This is useful, but I don't know how I would "change the equilibrium" formed by the connotation that mainstream society has assigned to the word "eccentric".

Would that imply that there is a hard, rigid, and abrupt limit on how accurately you can predict the actions of a conscious being without actually creating a conscious being? And if so, where is this limit?

I guess you mean that on an intuitive level you feel you have X intelligence, but upon self-reflection you think you have Y intelligence, and you can't change X to match Y.

Yes, that's exactly correct.
