All of Tiiba's Comments + Replies

My summary of Eliezer's position on free will

I'd like to propose a way for measuring a system's freedom: it is the size of the set of closed-ended goals which it can satisfy from its current state. How's that?

I also think that this is all you really need to not be confused about free will. It's the freedom to do what you will.

1TheOtherDave10yBy "goals," do you mean goals the system currently has? Or goals the system could in principle have? Or something else? If the first, it follows that I can increase a system's freedom by installing in that system additional satisfiable goals. Which is perfectly internally consistent, but doesn't quite seem to map to what we ordinarily mean by freedom. If the second, it follows that if you and I can each achieve N items from that set, we are equally free, even if my N items include everything I want to do and your N items include nothing you want to do. That, too, is perfectly internally consistent, but doesn't quite seem to map to what we ordinarily mean by freedom. I conclude that our confusions about what we ordinarily mean by freedom aren't quite so readily dissolved. Although it's possible you have some third option in mind that I'm not seeing that eliminates these issues.
AI is not enough

"So according to you, the laws of the universe are random. I think this hardly plausible."

I don't see why it is not plausible. It's not like the Universe has any reason to choose the laws that it did and not others. Why have a procedure, algorithmic or not, if there are no goals?

Things you are supposed to like

"Aesthetics is pretty reliable among humans, but what about in minds-in-general"

I don't think that's relevant. A fugue's job description doesn't include entertaining killer robots from outer space; it's supposed to entertain humans.

In general, I think any artwork should be judged (not enjoyed, but judged) based on whether the author succeeded or failed at what [s]he personally set out to do, and whether it was a hard thing to do - whether it is creating music that is different from all other music in every way imaginable while remaining musical,... (read more)

7orthonormal10yNow I wish there were a classical music piece entitled "Fugue in G for Killer Robots from Outer Space".
Rationality Quotes September 2011

My initial guess was "keep learning, there's always more to learn."

Malice, Stupidity, or Egalité Irréfléchie?

There's also the possibility that you're being inconvenient to them. Say, vegetarians can't go to a true meat lover's party, people who get up early might need ME to get up early for whatever reason, and if your business fails and I live with you, that's obviously my problem.

0orthonormal11yThis hypothesis is at least falsifiable- one can test whether the degree of a peer group's opposition to one of their members changing depends substantially on these sorts of inconveniences.
3taryneast11yYou do make a good point - especially re: failed businesses. Though I have also observed the opposite far more frequently. A good example would be when I want to go to bed at a sensible hour and I have friends telling me "oh, you can just stay up just this once". The friends gain by my staying up (more people to party with), but don't have to suffer the consequences (eg the lack of my ability to work effectively the next morning). I think this disparity in expected outcomes means they are free to try and "tempt" me to break my new habit that is mildly inconvenient for them.... because they don't have to pay the heavy costs.
4james_edwards11yMore generally, it may be that your unusual choices benefit you, but impose costs on your friends and family. Unusual choices are less "safe" - they can move you farther from ordinary outcomes, and the results are harder to predict. Compare the stereotyped conflicts between parents and their teenaged kids: Teenager (as seen by parents): "Later, olds! I'm going out with my poorly socialised friends to get wasted and hook up (maybe someone will get pregnant). Woo!" Parents (as seen by teenager): "Stop there! Ve have ways of preventing your fun! You are never allowed to do anything that you enjoy, ever!"
1NancyLebovitz11yAt a minimum, if one person in a household is on a significantly different sleeping schedule, it's going to be logistically more difficult-- everyone else is going to need to be careful about noise for extra hours, and people will have less time with each other.
Teachable Rationality Skills

"Let's get a bigger house, further away from work, so it has an extra bedroom in case Grandma comes over"

Not saying this is a bad example, but it COULD be the case that grandma never being able to come over is totally unacceptable. Which is also a pitfall - something can seem trivial until it goes away.

7Paul Crowley11yOnly if there's really no other way for Grandma to come over - not even for example sleeping in the living room so she can have the sole bedroom.
Harry Sue and The Methods of Rationality

So, having thought about it today, I realized that I did overreact. Some of the justifications offered made sense. But I still find McGonagall's reactions odd. Yes, a kid genius has special needs. But that doesn't mean he should be able to treat others like imbeciles. It might've been just shock, but at some point it should have occurred to her: I'm letting a prepubescent kid run circles around me. Even if he keeps being right, he's gonna get himself in trouble running his mouth all the time.

I know that she's supposed to be special, that most adults Harry kn... (read more)

Harry Sue and The Methods of Rationality

I honestly have no idea how that has anything to do with what I'm saying.

And I read all the books, and watched the first six movies. I know what the prophecy is.

You honestly can't see how knowing someone is the chosen one destined to save the world from the evil one and who quite possibly already stopped a war impacts on the likely behaviour of a character? Implied via that same knowledge is that Harry has the favour of McGonagall's immediate superior.

It should be overwhelmingly obvious to you how this information is relevant to what you are saying. Any judgement made without considering this context is absurdly ill-informed.

Okay. Professor McGonagall knows that Harry Potter is the one with the power to vanquish the Dark Lord, that he is marked as the Dark Lord's equal, and that he will have power the Dark Lord knows not. (She has, in fact, heard that Prophecy spoken in the terrible hollow echoing voice of Sybill Trelawney.)

Harry isn't acting like a normal eleven-year-old, or any kind of eleven-year-old, and Professor McGonagall has noticed that as well, in as many words.

That's all. If you think, under those circumstances, that the boy ought to be given the back of your han... (read more)

1falenas10811yAs in she knows they need him, so she's making an extra effort to make him like her, and by association, the wizarding world.
Harry Sue and The Methods of Rationality

The infodumps are not what I'm talking about. I wouldn't believe it's Eliezer's writing if people weren't smugly going on about eigenvectors. My concern is that Harry is a complete asshole. And the unrealistic adults.

3wedrifid11yIt's worse when he's trying to make Harry not a complete asshole. That just gets painful.
0Carinthium11yNot counting the Minerva case (on which your view is obvious), what incidents do you consider to involve him being an arsehole?

Aside from Harry's parents, there was only one "unrealistic" adult so far (by Ch6), McGonagall, who assumed Harry might have been abused. Her tolerance is reasonable.

It's irrelevant, though. Harry is behaving strangely, and you assume it's bad writing. I guess, since you have read some fanfiction ("OOC is irritating to me"), you acquired a useful heuristic for filtering out bad fanfic; it's just that it is bound to give some false positives.

5Eliezer Yudkowsky11yIn case you didn't notice due to lack of Potterverse familiarity, it was established back in chapter 5 that Professor McGonagall knows the prophecy.
Harry Sue and The Methods of Rationality

"What is the point saying "yes" or "no"?"

Um, none for you, I suppose. But it might mean some utilons for me. Anyway, a hint about which way the story might be heading would be good. (Comeuppance? Minister Potter? Furry slash?)

7TheOtherDave11yIf you go in expecting the standard narrative where hubris is punished and the status quo maintained, you will be vastly disappointed. Harry certainly fails at stuff and succeeds at other stuff, but at no point does he come to accept that the best thing to do is meekly accept the status quo. My guess, reading between the lines, is that you will like later chapters less and less.
5benelliott11yThere are points in the story where his arrogance results in severe consequences for him, and when other people call him out on some of his actions. There also continue to be points where he wins and proves how much smarter he is than almost everyone else. I still enjoy reading it, but I'm not going to force you to.
5ArisKatsaris11yAh, I didn't realize that such a hint was what you were asking for. Okay: He (eventually) does things that go very very wrong, and he (soon) starts understanding there must be something really really off/weird with him. So, yeah, he does get a comeuppance of sorts, for his arrogance. Though I'm not sure it's yet enough of a comeuppance for such arrogance.
Harry Sue and The Methods of Rationality

No, he is a Black Hole Sue, because, as I said, the abuse and condescension he gave to McGonagall did not result in any consequences. She's his goddamn TEACHER. Teachers don't expect to be treated like idiots.

And WHERE is his brain backfiring? And I know he's wrong. That's the point. But so far, he's winning anyway.

7hairyfigment11yWho do you believe serves as his antagonist? EY says on the author page that if you make Frodo a Jedi, you have to give Sauron the Death Star in order to have a story.
-1Carinthium11yI'm not quite sure the extent to which you're right or wrong, but a few points: 1- Once Minerva sees what she does of Harry's talk with Draco (realistic or not), it seems plausible she'd be impressed and give him slightly more slack because of it. In addition, once she begins to believe Harry has been abused it is possible she'd feel sorry for him. 2- Prior to this, I agree that she should have been significantly more annoyed- unless wizarding culture is very different for some reason, she should at least have been trying to restrain herself. 3- Part of the point of the fic was to attempt to popularise rationality (and it has at least extended the audience for Less Wrong's ideas). Given the necessity of that, changing Harry's backstory was probably the best way to do it.
96 Bad Links in the Sequences [fixed]

We can still try to go for 1337.

0Paul Crowley8y96 would seem like a natural target, no?
[SEQ RERUN] A Fable of Science and Politics

I'm just wondering if you're aware of this post: http://lesswrong.com/lw/lt/the_robbers_cave_experiment/

At first, I thought it was what you were talking about, but realized that the details are different (and kinda cool in a scary way).

Official Less Wrong Redesign: Call for Suggestions

I like it too, but think that just a bit more contrast would be good. Not a lot, but a little. As it is, it feels bland.

Separate morality from free will

"that it is dangerous to communication to use the term 'free will' in any sense other than freedom from causality"

Why is that? There are many things that can keep your will from being done. Eliminating them makes your will more free. Furthermore, freedom from causality is pretty much THE most dangerous definition for free will, because it makes absolutely, positively no sense. Freedom from causality is RANDOMNESS.

"Therefore free will vs determinism is not a productive argument."

We don't have this argument here. We believe that free will requires determinism. You aren't free if you have no idea what the hell is about to happen.

-1wedrifid11yFYI: You can make quotes look extra cool by placing a '>' at the start of the line. More information on comment formatting can be found in the help link below the comment box.
1JanetK11yI have been pointed at those pieces before. I read them originally and I have re-read them not long ago. Nothing in them changes my conviction (1) that it is dangerous to communication to use the term 'free will' in any sense other than freedom from causality, (2) I do not accept a non-material brain/mind nor a non-causal thought process. Also I believe that (3) using the phrase 'determinism' in any sense other than the ability to predict is dangerous to communication, and (4) we cannot predict in any effective way the processes of our own brain/minds. Therefore free will vs determinism is not a productive argument. Both concepts are flawed. In the end, we make decisions and we are (usually) responsible for them in a moral-ethical-legal sense. And those decisions are neither the result of free will nor of determinism. You can believe in magical free will or redefine the phrase to avoid the magic - but I decline to do either.
0khafra11yTook me a while, but I found it [http://lesswrong.com/lw/2ab/harry_potter_and_the_methods_of_rationality/238c]: "Lena Squatter and the Paragon of Vengeance" by SF author Leonid Kaganov.
Rationality Quotes: April 2011

I will repost a quote that I posted many moons ago on OB, if you don't mind. I don't THINK this breaks the rules too badly, since that post didn't get its fair share of karma. Here's the first time: http://lesswrong.com/lw/uj/rationality_quotes_18/nrt

"He knew well that fate and chance never come to the aid of those who replace action with pleas and laments. He who walks conquers the road. Let his legs grow tired and weak on the way - he must crawl on his hands and knees, and then surely, he will see in the night a distant light of hot campfires, and u... (read more)

0khafra11yHave you translated the whole story, or just this quote? It sounds interesting, and stacks up next to a SF story about somewhat less-than-friendly-AI as a reason I wish I could read Russian.
Rationality Quotes: April 2011

Better yet, don't go gaga. And use anchoring to your advantage - before haggling, talk about something you got for free.

Singularity Institute featured on Philanthroper

Well, if the donations they have to match go beyond what they'd donate anyway, they would donate more than they otherwise would. Plus, the goal is to get YOU to donate more than you otherwise would.

2Giles11yI've rethought and I was half-wrong. It was certainly rational from SIAI's point of view as Tiiba explained. Also it was rational for SIAI supporters to give one of their dollars via Philanthroper as I'm pretty sure the expected extra money SIAI gets is x where $1 < x <= $2. It might not be exactly $2 because we assume the matchers have not already decided exactly how much they will give over their lifetime. Future donations from them will be somewhat based on "have I given enough already?" which would be negatively impacted by past donations. So for people to get their full $2, matchers need to promise to match the money and not feel warm and fuzzy about it.
Size of the smallest recursively self-improving AI?

What's up with the word "foom", and why is it always in all caps? Can we come up with another name for this that doesn't sound like a sci-fi nerd in need of Ritalin?

0alexflint11yYeah I agree. "Intelligence explosion" is bandied about, but I guess that can also refer to Kurzweilian-style exponential growth phenomena. "Hard take-off singularity" is close, too, but not exactly the same. Again, it refers to a certain magnitude of acceleration, whereas FOOM refers specifically to recursive self-improvement as the mechanism. I'm open to suggestions.
0timtyler11ySee http://lesswrong.com/lw/we/recursive_selfimprovement/ [http://lesswrong.com/lw/we/recursive_selfimprovement/] for an attempt at a definition.
AI that doesn't want to get out

And if they're modified? It's a superintelligent AI. You can't take it down with a shotgun, even if it's built into your arm.

0timtyler11yNo, no: tools. If someone has made a machine intelligence, the rest of the planet will probably have some pretty sophisticated equipment to hand. The competition for machines comes mostly from the previous generation of machines.
AI that doesn't want to get out

Sure, but it's their funeral.

Another AI might succeed, but not humans. I think there would be at least a few weeks before another one appears, and that might be enough time to ask it how to make a true FAI.

0timtyler11yWell, not unmodified humans. You don't execute a 21st century jailbreak with spears and a loincloth. The outside world is not as resource-limited - and so it has some chance of gathering useful information from the attempt.
AI that doesn't want to get out

Well, then they'll have themselves to blame when the AI converts their remains into nanomachines.

Not sure what you're saying.

0timtyler11yYou don't see why people would want to break into a compound containing the first machine intelligence?
AI that doesn't want to get out

I did mention explosions. And gravity? I don't see what it could do with gravity. Although I see that it could do something with vibration.

AI that doesn't want to get out

It's allowed to produce waste heat. I see no reason to let it make anything else. I know it can't actually cut itself off from the universe, but it shouldn't enjoy this fact.

4benelliott11yUnfortunately it can't even limit itself to this. Every object with mass exerts a gravitational attraction on every other object, it can't help but affect the world outside through these means as well, so we have to allow it to do so, which may result in disaster for all we know. We also have to allow some radiation out, since this is also unavoidable. At this point I should point out that detonating a nuclear warhead can probably be fit into the category of "emitting waste heat and radiation".
AI that doesn't want to get out

Why? This is the whole point - to prevent it from interacting with anything not intentionally given to it.

Vizier AIs

A problem I see with such a sequestered AI is... that it doesn't stop other people from building free AIs. You have to be FIRST.

Vizier AIs

Sorry, I deleted the post without realizing you replied to it. I realized it had problems and decided to give it more thought for now.

The "supernatural" category

I wouldn't say it's necessarily mental. But if it's a huge lump of properties that can't be explained by the rules that govern everything else, it would be supernatural. Or it could even be simple, but still have an exception to an otherwise universal rule. For example, in a Tegmark universe governed by the factorial function, finding a 10 could be considered miraculous. In our universe, it could be an object that doesn't cast a shadow, doesn't glow on the underside, and is not transparent.

Also, rstarkov, see my reply in the thread linked above.

These mathy definitions are, of course, for times when "supernatural" isn't just a stand-in for "stop thinking about it!"

1torekp11yYour reply and anonym's are fundamentally right, I believe. To spell it out more, we need to extend the concept of Similarity Clusters to laws. I mean laws as in "natural laws" and perhaps "supernatural laws", not as in rules passed by legislatures. To take the supernaturalists seriously, we have to hypothesize that there are exceptions, perhaps even regular exceptions, to natural laws. Where natural laws conflict with supernatural ones, the hypothesis goes, the supernatural ones triumph. That's what makes them super. By the way, supernatural "laws" might just be descriptions of alleged supernatural properties. E.g., telekinesis is the power to move stuff just by wishing. Doesn't this just push the puzzle back a step? How do we distinguish natural laws from supernatural ones? By clustering. Natural laws form a tightly knit explanatory framework. For example we can explain lots of chemistry via QM. Natural laws use terms like mass, charge, acceleration. Etc. Supernatural items are claimed not to fit into the same tightly knit explanatory framework. They are described using terms with no apparent relation to mass, charge, acceleration. Etc. But, let me say where the "ontologically basic mental things" account is onto something. The paradigm examples of supernatural objects and qualities are usually mental. Or if not the paradigm examples, then at least a large and important category. Since it is indeed hard to see how painfulness or the sensation of sweetness relates to mass, charge, acceleration, etc. - especially if one glosses over the difference between epistemic puzzles and metaphysical ones - the mental has long been an attractive zone for claiming that a different constellation of laws are in play. ETA: I see Manfred beat me to it. I'll leave mine here because my version is a little further out on a limb.
Tortuga Meetups Starting this Thursday

Did I miss something? Why do meetups have summaries again?

Admit your ignorance

Well, I dunno about that...

Positive Thinking

Actually, I said it wasn't atheist AT ALL. About as atheist as Don Quixote was a knight. Even the atheism was a manifestation of the personality cult of Karl Marx.

Positive Thinking

There was also the issue of communism, which is nothing if not a cult.

4Eugine_Nier11ySo are you trying to say that the USSR wasn't truly atheist? That sounds like no true scotsman [http://en.wikipedia.org/wiki/No_true_scotsman].
A Transhumanist Poem

Well, I think it's pretty nice.

0Swimmer96311yYay. One person.
The non-painless upload

Well, I would NOT press the button. The average copy gets 500 years of being a creationist, plus half of an immortality. My values prefer "short but good".

2Desrtopa11yIf you would value a shorter life for yourself in which you are not a creationist over a longer one in which you are, do you weight the lives of creationists much lower in your utility calculations? Would you rather save one non-creationist than seven creationists?
What's the single best introduction to evolution to give to a creationist?

The problem with some creationists (the ones who get the basics), as I understand it, is not that they don't think evolution is happening, but that they don't think it's fast enough to transform proto-bacterial zero-cellular balls of chemicals into people in a mere three billion years. Although, personally, I think it's a really long time.

Why is reddit so negative?

Somebody wants to suppress advertising, too, though.

Punishing future crimes

What about the schadenfreude from pissing off Hitler?

Of course, he might become even more psycho from it.

0[anonymous]10yThe Academy of Fine Arts Vienna shouldn't have rejected Hitler!
2endoself11yThat would count as an effect on his future actions.
The Best Textbooks on Every Subject

You guys do what works for you, and I'll do what works for me. Maybe I just don't have the patience. Or maybe you don't have something required to understand lossily compressed info. Or both. I just know that books take all day long and help as much as short online tutorials. And the tutorials are often free.

4PhilGoetz11yHow about you start a thread for recommending online tutorials?
0tel11yIf lecture notes contain as much relevant information as a book, then you should be able to, given a set of notes, write a terse but comprehensible textbook. If you're genuinely able to get that much out of notes, then yes that definitely works for you. The concern is instead if reading a textbook only conveys a sparse, unconvincing, and context-free set of notes (which is my general impression of most lecture notes I've seen). Both depend heavily on the quality of notes, textbook, subject, and the learning style you use, but I think it's a lot of people's experience that lecture notes alone convey only a cursory understanding of a topic. Practically enough sometimes, test-taking enough surely, but never too many steps toward mastery.
The annoyingness of New Atheists: declaring God Dead makes you a Complete Monster?

No, I know all those words, but you're using them way too much. A lot of them are very apt labels, but they just don't look right outside of TV Tropes. Just like outdated slang.

6Emile11yI agree - "Brown Note" and "Berzerk Button" were unneeded. I'd rather we didn't assume any cultural baggage in readers of LessWrong. Using those words is like a sign saying "If you don't know what this means, you're not hip!". Which is especially bad because it's not true - understanding TVTropes slang is not considered a prerequisite for reading LessWrong.
0Raw_Power11y... So I am being... so to speak... Totally Radical... -_-;
6JoshuaZ11yI'm less annoyed by the use of TV Tropes vocab, and more irritated by their incorrect use. For example, he says that "they tend to nickname Hollywood Atheist" when no one uses that term except TV Tropes (and maybe some agnostics and atheists). The term explicitly means a certain caricature of atheism, recognizing that it is a caricature. No theist is going to use the term.
0Raw_Power11yMaybe I should pothole all the wiki words I've been using? I was under the impression they were self-explanatory, but that might be myopia on my part.
The Best Textbooks on Every Subject

They get ME bored. Every book is six hundred to a thousand pages, and when you're done with it, you've got a hundred pages worth of knowledge. I think it's better to memorize some passwords, then separately look up specific ideas that didn't make sense.

1lukeprog11yFair enough. :) I love reading a good textbook. Good nonfiction is so much more exciting for me than good fiction. And of course, I learn far more from good nonfiction.
The Best Textbooks on Every Subject

In college, I found most of the time that the professor's lecture notes contain almost everything of value that both the textbook and the lecture contains, but they contain ten times less text. This led me to believe that textbooks are a terribly inefficient way to convey facts, by comparison to the format of lecture notes. Books are words, words, words, flowery metaphors, digressions, etc. Hell, I don't know what they spend all those words on. But I know that, potentially, lecture notes are one fact after another.

2prase11yI join NihilCredo and lukeprog in this. Textbooks usually have less text than what I would find ideal, not more. Lecture notes (and many textbooks which seemingly maintain an even formula-to-text ratio) take me more time to read than a book which contains the same number of formulas and four times as much text. I can't continue reading after having stumbled upon something which looks like an inconsistency, non-sequitur or counterintuitive definition (that usually first happens on page 5 or so) and then have to spend time trying to find out what is wrong (and if I fail, then must spend some more time persuading myself that it doesn't matter and reading can continue). On the other hand, if the author spends some time and pages explaining, such events occur much less frequently.
3NihilCredo11yI usually find that (good) textbooks can let you learn the subject matter by yourself, whereas lecture notes are excellent reference material but, if you didn't attend the lectures, they're just not going to make for good building material on their own.

I find all those extra words surrounding the bare facts in textbooks to be highly useful. That's what helps me not just memorize the teacher's password but really understand the material at a gut level.

Vegetarianism

See, the difference is not that meat isn't "to my taste". I like the taste. The problem is that it's EVIL.

Working hurts less than procrastinating, we fear the twinge of starting

I have ADD, and I think that I'm somewhere between the two extremes. Although not working is always more fun than working, I find that I can get in the flow on occasion, and crank out a lot. But even my strongest flows are punctuated by many distractions.

The Santa deception: how did it affect you?

Well, the Santa deception wasn't used on me. The false belief I held was that nobody actually takes Santa seriously. And also that I was bought in a store, which made me wonder where the store got me. Although I didn't take that one too seriously either, finding out the truth was still pretty disturbing. You mean he... she... they... EWWW! (Yeah, I got better.)

(Meme) Penis goes where?

0Desrtopa11yI suppose my parents were probably a bit atypical in giving me The Talk (complete with admonishments to Always Use Protection) before owning up to the nonexistence of Santa. I believe I was five at the time.