So I'd intended this story as a bit of utterly deranged fun, but it got out of control and ended up as a deep philosophical exploration, and now those of you who care will have to wade through the insanity.  I'm sorry.  I just can't seem to help myself.

I know that writing crossover fanfiction is considered one of the lower levels to which an author can sink.  Alas, I've always been a sucker for audacity, and I am the sort of person who couldn't resist trying to top the entire... but never mind, you can see for yourself.

Click on to read my latest story and first fanfiction, a Vernor Vinge x Greg Egan crackfic.

Moderation Guidelines: Reign of Terror - I delete anything I judge to be annoying or counterproductive

Ravna is very sure that the process of escaping from simulations must bottom out in finitely many steps. However, there's a technique for implementing an infinite tower of simulations: an environment in which you can always break out of the current level of interpretation, get at the interpreter for that level, and go on digging indefinitely. More here.

Of course, there's still a base level -- the trick it uses is to create the interpretive levels only when they're entered. But even if you could somehow get out of the whole infinite tower, you still couldn't be sure that the level you found yourself at hadn't also just been created.
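For concreteness, here's a minimal sketch of the lazy-tower trick (my own toy code with made-up names, not taken from the linked article): each interpreter level is instantiated only the first time something escapes into it, so the tower is "infinite" on demand while only finitely many levels ever actually exist.

```python
class Level:
    """One level of a lazily constructed tower of interpreters.

    The parent (the interpreter running this level) is created only
    when someone breaks out to it, so you can always dig one level
    further, yet only finitely many levels exist at any moment.
    """

    def __init__(self, depth=0):
        self.depth = depth
        self._parent = None  # not built until first escape

    def escape(self):
        if self._parent is None:  # create the next level on demand
            self._parent = Level(self.depth + 1)
        return self._parent


base = Level()
lvl = base
for _ in range(1000):
    lvl = lvl.escape()
print(lvl.depth)  # 1000 -- and you could keep going indefinitely
```

Which is exactly the catch in the comment above: the "base level" is whatever `Level` object you happen to be standing on, and it may itself have been freshly created.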

"Three", Pham replied.

(He was, after all, the only one who had achieved universe tunneling within a universe.)

Took me a minute to get it. Since he doesn't know for sure that his Qeng Ho memories really happened in Ravna's universe, how should he count? Presumably the gang's encountered the situation before and has a rule decided on, though; or maybe it'll be possible to reconstruct a remnant of Old One.

"Two" (says the version of me that I imagined making this comment before I actually went ahead and did so, thus letting the imagined version of me become the actual me, thus justifying the claim of "Two" :))

I really enjoyed the story, and I have to say that I prefer your ending for Permutation City over the one that Egan wrote.

The story is an alternate history of A Fire Upon the Deep, but it's a sequel to Permutation City - it's not an alternative to Egan's ending, but something that could have happened after Egan's ending took place as written.

Oh, I understood that. Except that your explanation of what happened at the end of Permutation City made sense whereas how that story actually ended did not. Hence I prefer your explanation of the ending of Permutation City to the one provided in the book.

"Let there be alcohol!" A moment later, bottles began to rain down from the sky...

Delayed-action popping-tonic?

Frigging awesome. (I haven't read Permutation City, but have now bumped its to-read status from maybe to definitely.)

Of the characters not already identified, I'm afraid the only ones I recognized are Louis Wu and the Lensman.

I think I have the solution to the problem of how to weight the runtime of programs to produce coherent experiences. (I worked this out as a response to Hume's problem of induction, because at the time I was studying the problem, I hadn't yet heard of the Solomonoff prior.)

My solution is this: in a nutshell, if an unknown program outputs a million 0 bits in a row, we want to believe the next bit is more likely to be 0 than 1. Can we do this even if the program is more than a megabit long?

Yes. Most long programs won't output a million 0 bits in a row. Of those that do, most are not doing so because they contain such a string in a literal print statement. They are doing so because execution got hung up in a small loop. And in that case, it will probably stay in that small loop at least for the moment. So we don't need a bias in favor of short programs; we can expect our experiences to be orderly with generic weighting.
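This argument is easy to poke at empirically. The sketch below is a toy model of my own devising (not anything from the comment above): it uses random no-input finite-state machines as crude stand-ins for "generic long programs". Among the machines that emit twenty 0 bits in a row, essentially all have gotten stuck in an all-zero loop, so the next bit comes out 0 as well, with no bias toward short programs needed.

```python
import random

def random_machine(n_states, rng):
    # Each state emits a bit and names a successor state (no input):
    # a crude stand-in for an arbitrary deterministic program.
    return [(rng.randint(0, 1), rng.randrange(n_states))
            for _ in range(n_states)]

def run(machine, steps):
    state, bits = 0, []
    for _ in range(steps):
        bit, state = machine[state]
        bits.append(bit)
    return bits

rng = random.Random(0)
agree = total = 0
for _ in range(50000):
    out = run(random_machine(8, rng), 21)
    if all(b == 0 for b in out[:20]):   # emitted twenty 0s in a row...
        total += 1
        agree += (out[20] == 0)         # ...what does bit 21 look like?

print(total, agree / total)
```

In this toy class the agreement is total: with only 8 states, any machine that has emitted twenty 0s is already trapped in a cycle of zero-emitting states, so bit 21 is 0 every single time.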

Just to verify... The man in the bloodstained sweater, that was the "Ultimate Battle of Ultimate Destiny" version of Mr. Rogers?

Anyways, cool story. :)

He should have been identified unambiguously in the second chapter, when he said "It's a beautiful day in this neighborhood." If anyone didn't notice the second chapter, I'll have to add a warning in the intro that there are 2 chapters (= 3 sections total including the intro), for those not used to fanfiction.net.

Yes, that identified him pretty unambiguously as Mr. Rogers... I was trying to figure out the significance of the bloodstained sweater, if he was thus the "Ultimate Battle" version. :)

Oh, incidentally... It should actually be possible to set up the base-level computer to run all the programs for equal amounts of time (at least at the base level; taking into account programs containing other programs is a whole other issue):

Round 1: Program 1: 1 step

Round 2: Program 1: 1 step; Program 2: 2 steps

Round 3: Program 1: 1 step; Program 2: 1 step; Program 3: 3 steps

Round 4: Program 1: 1 step; Program 2: 1 step; Program 3: 1 step; Program 4: 4 steps

... etc.

Interesting; I confess I hadn't thought of that at all! Now I wonder whether using this rule along with the underlying anthropic premise would cause subjective experience to dissolve into chaos, or make no discernible difference (i.e., reality still ends up looking just as ordered for the most part), or whether it argues against the underlying anthropic premise by showing how easy it is to make probabilities refuse to converge to a timeless limit.

(And yes, it's that Rogers - you can tell because he's the closest thing the group has to a leader. One wonders how the blood got on his sweater. Surely it's not the blood of an enemy, as the original song implies. Perhaps it's the blood of Big Bird, who died fighting for Amber, or something along those lines.)

The bloodstained sweater in the original song refers to an urban legend that Mr. Rogers was a Marine Sniper in real life.

Thanks. :) This story simply made me wonder if it was possible to create a "fair" scheduled version of the program-of-all-possible-programs, and this was the first one I came up with. Not sure if there are any more elegant ways of doing that.

Though, of course, this wouldn't change the whole issue of programs being embedded in other ones... And, actually, my instinct is that, given that all programs are actually run with an unbounded number of cycles, I'm not sure that the RATE at which they're run relative to each other would affect the amount of reality-fluid each one got, but there's much here that I'm confused about on that matter...

(EDIT: to clarify, I don't think the relative rate they're run at would make any difference.)

Oh, just for clarification: With the bit about Maria distributing the computing time exponentially according to complexity, did you mean each higher (starting?) complexity program got exponentially less time or exponentially more time? And what was Maria's motivation there for that scheduling rule?

Simpler programs got exponentially more time. Mostly she's just trying to match the "natural" distribution of programs, if there is such a thing. Allocating more time to simpler programs may help because it means that, e.g., simple programs which also simulate all programs in order, will get a lot of computing power, so it helps equalize the flow in a way that doesn't depend as much on your initial choice of universal machine. Another way of looking at it would be that allocating equal time to all the programs would tend to make life less simple - to increase the probability of arbitrary things happening - which seems like a net negative for sentient life, ceteris paribus.
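One concrete way to realize "simpler programs get exponentially more time" is a Levin-search-style schedule; this is my guess at what Maria's rule could look like, not the story's stated mechanism. In phase k, program n (for n <= k) receives 2**(k - n) steps, so program n's long-run share of the total computing time tends to 2**-n.

```python
from collections import Counter

def exponential_dovetail(phases):
    # Phase k: program n <= k gets 2**(k - n) steps, so earlier
    # (simpler) programs receive exponentially more time overall.
    for k in range(1, phases + 1):
        for n in range(1, k + 1):
            yield (n, 2 ** (k - n))

totals = Counter()
for program, steps in exponential_dovetail(20):
    totals[program] += steps

# Program n accumulates 2**(21 - n) - 1 steps, so each program
# gets about twice the time of the next one down the list:
print(totals[1] / totals[2])
```

Under this weighting a simple program that itself dovetails all programs inherits a big share of the time, which is the "equalize the flow across choices of universal machine" point above.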

(Alternatively, I wonder if Mr. Rogers has a Superpowered Evil Side and that's how he got the blood on his sweater.)

Okay then. You may want to edit the phrasing. As written in the story, it seemed a bit ambiguous but leaning toward her stating that she set it up to give more complex programs more time. At least so it read to me.

Hrm... a superpowered evil side for Mr. Rogers. Given that his good side could wrap senate committees around his finger (seriously, did you ever watch that vid of him testifying about the importance of not canceling funding for public broadcasting?) just by being in real life the way he was on his show...

But yeah, that story was fun. As delightfully twisted as Fractran. (Yes, I am comparing a story to a model of computation. But, given the nature of the story, is this not perfectly reasonable? :))

I'm not convinced that when you look at the whole set of minds doing dovetailing simulations and put probability distributions on how far they go, your algorithm and Eliezer's give different results. Actually calculating it out looks a bit tough; my intuition is based on the fact that a simulator doing N computations gives program n of the order of Sqrt(N) computations using either algorithm, provided that n << N.
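The Sqrt(N) intuition checks out numerically for the two schedulers in question; here's a throwaway sketch (the scheduler names and code are mine). Under the plain dovetailer, program 1's step count after N total steps is about sqrt(2N); under the catch-up variant it is exactly sqrt(N). Same order either way, as claimed.

```python
import math
from collections import Counter

def classic(rounds):
    # Plain dovetailer: round r gives one step to each of programs 1..r.
    for r in range(1, rounds + 1):
        for p in range(1, r + 1):
            yield (p, 1)

def fair(rounds):
    # Catch-up variant: new program r gets r steps, older ones 1 each.
    for r in range(1, rounds + 1):
        for p in range(1, r):
            yield (p, 1)
        yield (r, r)

results = {}
for sched in (classic, fair):
    totals, N = Counter(), 0
    for p, s in sched(1000):
        totals[p] += s
        N += s
    results[sched.__name__] = (totals[1], N)
    # program 1's steps vs. sqrt(total work done so far):
    print(sched.__name__, totals[1], round(math.sqrt(N)))
```

After 1000 rounds, `classic` has done 500500 steps (program 1: 1000 of them, about sqrt(2N)) and `fair` has done 1000000 (program 1: 1000, exactly sqrt(N)).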

Well, since any finite number is smaller than infinity, for ANY program, once it starts running, it would get just as many steps per, well, step, as any other program (in the original version). ie, consider two programs A and B such that A came earlier. In the original scheduler, once B started up, for each tick A gets, B would also get one tick. But A would also have an initial bunch of ticks that it got before B even started.

My version makes sure that B gets those extra ticks too, that's all. I, personally, don't think it would change the probability distributions that would be experienced from the inside, given that the base computation really is run with unbounded resources and so on and so forth.

Ah, I was thinking more of a huge (infinite?) set of simulators, each running for some finite number of ticks. Then the subjective probability of being in program number n is related to the proportion of simulators that run program n for long enough to reach a feasible world for you to be in. So, sure, program A gets more ticks than B in the original scheduler, but I think the determining factor is how many simulators go on to run B at all.

Ooooooh. No, I guess the model we're using here (that is, the fanfic in question) is that somewhere down the levels there is a single simulator running a "program of all possible programs".

Although, I wonder if we can then just say the bottom level is Tegmark's Level 4 Multiverse and get rid of any actual machine or such at the lowest level. :)

> Although, I wonder if we can then just say the bottom level is Tegmark's Level 4 Multiverse and get rid of any actual machine or such at the lowest level. :)

Tegmark's Level 4 doesn't answer the question of how much weight each experience has. It's a similar problem to asking where do the Born probabilities come from.

Well, it doesn't seem to me that it'd be any more confusing than "turing machine running the program of all programs" as far as difficulty of reasoning about weights.

Thanks! :)

For my next trick: A scheduler that can look inside a program, take into account the fact that it's encoding another program, and schedule stuff so that the total number of ticks that each program gets, summed across all instances of the program, including those embedded in other programs, would be approximately the same for all.

Or I'll just leave this one as an exercise for the reader. :)

And since somewhere in this discussion the link is deserved: The Ultimate Showdown of Ultimate Destiny

Concepts contained in this story may cause SAN Checking in any mind not inherently stable at the third level of stress. Story may cause extreme existential confusion. Story is insane. The author recommends that anyone reading this story sign up with Alcor or the Cryonics Institute to have their brain preserved after death for later revival under controlled conditions. Readers not already familiar with this author should be warned that he is not bluffing.

LOL :D

I have to say: as awesome as this fanfic was, I think if I had seen it before I had ever donated to SIAI and I had recognized half the Reddit & other obscurities, I wouldn't've sent in so much as a cent!

That sounds like something important for me to know about. Why?

The reasoning would go something like this:

  1. FAI is, these people say, an insanely hard problem.
  2. Insanely hard problems require insane amounts of time/dedication/effort.
  3. The author of this has clearly spent a ridiculous amount of time reading/watching/consuming media.
  4. You can't both spend a ridiculous amount of time consuming media/writing fanfics, & a ridiculous amount of time working on FAI.
  5. Since I am observing the former, I can infer the absence of the latter...
  6. If they aren't working hard on FAI, why then should I donate? (Their work is entertaining me quite a bit, yes, but I don't donate to every blog I read.)

Alternate, more fallacious version:

  1. FAI are Serious Business.
  2. This are not serious person.
  3. Therefore, this are not FAI person.*

Now, as for Reddit: I fear to reread the fic to get specifics, but I remember thinking at least twice reading through it, 'isn't that a reference to an obscure in-joke or article I've only ever seen on Reddit?' Whether those were truly Reddit refs/Reddit-reading-sourced or not is almost irrelevant.

EDIT: * I hereby dub this a 'sylolgism'. Go forth and plague the people with fallacious proofs written in lolcat.

It doesn't take a ridiculous amount of time to consume enough media -- a normal amount of entertainment time would do just fine. Over the years, little things accumulate. I've probably consumed 100 kg of cheese in the last 10 years, but that doesn't make me a cheese-swallowing python.

Sure; but Eliezer isn't that old! And don't forget, these aren't the only ones who he consumed, but the ones he consumed, remembered, and chose to put into the fic.

If we (or more realistically, he, since I'm not sure whether any of the LW readers can identify every ref he used) toted up all the refs and got, say, 120, how much could we infer he has consumed? Surely several times that, possibly an order of magnitude or two more.

And some of the works are very extensive (Evangelion TV, 26 eps at 30 minutes a piece = 13 hours + 3 hours of movies; Oh My Goddess, 39 volumes (20 minutes apiece?), 5 OVAs (half an hour), a movie (2 hours), 50 TV episodes (half an hour) = 42.5 hours). And there's only so much waking time.

And then there's the recency of most of these works. If we make some reasonable assumptions about when Eliezer's most mentally productive period would be, mirroring those for mathematics, he's through a decent fraction of it.

You can just read about them instead of actually reading or watching the entire series. I know a bit about Ah My Goddess, but haven't actually watched any of it.

Incidentally, I recently plowed through the entire Eva TV series, watching one DVD a night, on six non-consecutive nights, and just finished watching End of Evangelion now.

As for the sheer quantity of references, I could probably duplicate them myself by looking through my book collection... by the time I finished high school, I had enough paperback novels to fill several bookshelves.

Is it having hobbies that you object to or that he does not devote sufficient effort to constructing a 'serious' public image? I note that the time wasted with the formalities involved in such signalling are more than enough to become well versed in science fiction.

I think my objection, as a past donor, is that: by producing & deliberately posting in a high-profile place something like this fanfic, Eliezer - the founder & public face of SIAI - is damaging the serious public image which is necessary for SIAI to accomplish its goals.

SIAI works in the real world; I don't think it can afford for its people to make contrarian choices to watch anime instead of signal.

Gwern, I refer you to http://xkcd.com/137/

At the risk of violent downvoting, one of the many reference points that jumped into my mind while reading was 'the closest thing I've experienced to jumping between nested levels of reality is on drugs'.

http://xkcd.com/137/

"Don't let yourself unreflectively fall into a routine" and "don't be emotionally uncomfortable with nonconformity" are of course good advice; "be indifferent to PR when you're trying to do something for which PR actually matters" is bad advice.

How about the middle ground - "If constant PR consideration stops you from expressing yourself all the time, maybe it's time to reconsider your priorities"?

Posting stuff on Facebook that might get you in trouble is the archetype these days, I suppose, but I really can't bring myself to care about things like that.

Maybe I just don't have a strong enough terminal value to protect right now, but I find it easier to imagine myself thinking, 50 years hence, "I wish I'd just decided 'to hell with it' and said what I thought" than "I wish I'd shut up, gone with the flow and eased my path."

I'll hit you up in late 2059 and let you know how that went.

That is a reasonable objection. Of course, it is a completely different objection to the one you previously rationalised.

I don't think it is; my first 2 comments were reasoning a potential donor would use, while my comment just above is my actual reasoning as a past donor (thinking about the reasoning of potential donors).

I don't think this is too tricky or abstract a distinction, but then again, if I were just rationalizing my dislike for Eliezer spending his time on fanfics, I suppose the distinction would also make perfect sense to me...

I agree with wedrifid: this is a different complaint. Did you learn something new about EY's media consumption habits from this story that you did not already know from references in his essays? I doubt it. Maybe it is important that it is concentrated, that it is easier to notice than the scattered references. Someone reading his body of work is already committed, or at least has a lot more information to assess.

I suspect that the media consumption is a red herring and the real complaint is about writing fiction or fanfiction. Did you have any reaction to the other works of fiction?

Lest I seem to be judging you (gwern) for expressing disapproval of Eliezer's identity-signalling choices, I'll note that I have no particular problem with the expression of that preference.

Another reason that a donor could object stems from a core motivation for altruistic donations. Affiliation to high status people and institutions. If the high status figurehead is writing fanfiction then a donor might be expected to resent the perceived devaluation of their investment.

On the other hand, Eliezer's contrarian nature is the reason I am considering donating to SIAI rather than seeking ways to undermine it. I know all too well how status impairs human judgement, and I balk at the thought of amplifying that risk with the creation of a superintelligence. Eliezer's last work of fiction was a damn good metaphor. If I wielded a Sword of Good I would slay the high-status courtier but let the contrary Lord of Darkness cast that FAI spell of ultimate power.

> Did you learn something new about EY's media consumption habits from this story that you did not already know from references in his essays? I doubt it.

I learned nothing about what kind of media EY consumes. I knew perfectly well that he reads much the same kind of SF/Fantasy/manga/anime that I do (as well as fanfiction). What I learned new was inferences about the degree of his consumption. See my other comment about the orders-of-magnitude difference between consumption and recall.

I think you err in inferences about EY's degree of consumption based on his ease of recall. Given his extreme intelligence, we would expect him to have extraordinary recall relative to almost everybody with similar habits of consumption. Reading/viewing just moderately more than your average avid reader/viewer and having an extraordinary memory seems more than sufficient to explain this case.

And I think your criticism is not really valid given that EY is the mad scientist of the organization. It would be more appropriate -- if relevant, which it doesn't seem to be -- leveled at Michael Vassar, the President of SIAI.

> I think you err in inferences about EY's degree of consumption based on his ease of recall. Given his extreme intelligence, we would expect him to have extraordinary recall relative to almost everybody with similar habits of consumption. Reading/viewing just moderately more than your average avid reader/viewer and having an extraordinary memory seems more than sufficient to explain this case.

Perhaps. Short of testimony from EY or testing him, I can't know directly whether it's great recall or great consumption.

> And I think your criticism is not really valid given that EY is the mad scientist of the organization. It would be more appropriate -- if relevant, which it doesn't seem to be -- leveled at Michael Vassar, the President of SIAI.

I'll fall back on what I've said before: even if EY is not actually spending so much time on consuming media that it's detrimental to his performance, the appearance is still damaging. What are the odds that every potential donor who sees things like this will just go 'oh, that lovable-scamp/mad-scientist EY!'?

I dunno about you, but in the time period I was raised in, the archetype of 'mad scientist' didn't include "loves fanfiction". (Leaving aside entirely how relevant or important Michael Vassar may or may not be in fundraising & public outreach.)

I think of Feynman as the archetypal mad scientist, and while I don't think he happened to love fanfiction (and actually, don't we mean "writes fanfiction"?), I wouldn't have been surprised to have found out that he did and I wouldn't have thought less of him if he did.

I think the real issue is not that "writes fanfiction" is not part of the archetype but that you have (or think others will have) some kind of moral/emotional reaction to "writes fanfiction" that causes you to think about it in different terms than "writes poetry" or "loves functional programming" or "loves stamp collecting" or "loves civil war re-enactments" or whatever.

I think the underlying question is how inauthentic one should be willing to be in order to "present the best image." You and I both love functional programming, but there are many "Enterprise Architects" that would find passion for functional programming weird and suspect, deeming it pointless love of complexity for the sake of obfuscation. Imagine you were a public figure for a software company that marketed mostly to Enterprise Java shops, and somebody tells you that you should consider avoiding writing publicly about functional programming, working on xmonad, participating in haskell-cafe, because it might give potential customers the wrong impression (however stupid that "wrong" impression might be). If you think that "functional programming" and "stamp collecting" and "writing poetry" are more valid "side passions" than writing scifi or fanfiction, can you give a good explanation for why, or is it just a matter of "what most people would think"?

Robin Hanson wrote about a relevant phenomenon in Why Signals Are Shallow:

> We all want to affiliate with high status people, but since status is about common distant perceptions of quality, we often care more about what distant observers would think about our associates than about how we privately evaluate them.

Thus, people can genuinely dislike their allies having an activity that gives shallow negative impression (feel the dislike, not just deem the activity a mistake), even if they understand this first impression to be incorrect, or that any person giving a minute's thought to the question will come to the same conclusion.

After re-reading that, and reflecting on my feelings reading the OP, I think my opinion of Hanson's signaling theories has gone up quite a bit.

This explains a LOT as applied to the feedback I get.

Money is just a proxy. Status makes the world go round.

Besides, don't forget that "there's an insufficient amount of fun in the world" is an EXPLICIT principle and motivation of what Eliezer's doing, right? So, given that he wants to use "mad science" to increase the amount of fun in the world, it's probably useful to see what sorts of fun and goofiness he gets up to, no? (More to the point, if there wasn't any such available, if he was instead all somber and serious, the reaction would be "this way-too-somber guy is the one that plans on greatly increasing total fun in the world? I wonder what he thinks would count as 'fun'".)

> If you think that "functional programming" and "stamp collecting" and "writing poetry" are more valid "side passions" than writing scifi or fanfiction, can you give a good explanation for why, or is it just a matter of "what most people would think"?

But this is a specifically empirical question. Go look around the Internet - what's the predominant view of fanfictioners among non-fanfictioners (who are aware of them)? It is very very negative, I mean, close to furry levels of opprobrium. To give an example, here's the first response when I asked for free association for 'fanfiction' in #wikipedia:

11:00:53 < Lubaf> gwern: "Reeeeaaalllly creepy ideas about sexuality."

What do you think the predominant view of functional programming is? 'creepy sex' or 'hopeless loser' or

11:02:45 < quanticle> gwern: Mathematical lambda functions.

Yeah. When I tell CS people my hobby is Haskell, do they back away and in the future avert their gaze from me? Or do they look interested and re-evaluate their opinion of my competence? (The stereotype seems to be that if you use Haskell, you must be very smart indeed.)

I cannot speak for 'Enterprise Architects', but I would be surprised if the impression of Haskell among them was overall negative, and that they would hold negative opinions of any Haskell user.

It was a hypothetical, a thought experiment of the form "if it happened that an Enterprise Architect believed..., ", would you let that influence your behavior or would you say the disapproval is their problem (even if it does cost you something to ignore their disapproval)?

It wasn't a very good example though. I guess my point is that the stereotype-based disapproval of ignorant people who think that knowing somebody wrote a fanfiction gives them deep insight into their personality and their worth as human beings is not something to lose sleep over, it's something to be ignored or even ridiculed.

Regarding Reddit references, all that comes to mind for me is "Squirrel Girl as memetic badass", which isn't Reddit-specific.

I don't care so much about spoilers as missing obscure references. What must I have read in this regard to enjoy the thing?

Anyone who got all the references in this story, without Google or reading the comments, wins one hundred geek points.

(Because of the implied depth of coverage required to completely overlap a sample of someone else's reading/viewing.)

I figured part of the enjoyment would be getting together in the comments to work it out.

I got an awful lot of the references but some I can't crack without at least some Googling.

Two references you really, really should have made, but didn't:

  • Stephen King's "Dark Tower" series.
  • The video game "Star Ocean 3".

Also, Rincewind is the Discworld character who got stuck dimension hopping (including a brief visit to the real world), not Vimes.

But having Vimes along would be so much more useful! Rincewind just isn't a team player.

You're probably right about that; Rincewind would be constantly looking for opportunities to run away. He'd fail, but he'd still try. I'd rather bring along Granny Weatherwax or Susan Sto Helit than Vimes, though.

Which would make the best LessWronger? I'd call Vimes the doggedly rational type... but Susan is probably more likely to go about her dogged rationality in a theoretical/mathematical way. And Granny Weatherwax does like methods that WIN, but I'm not sure that she cares very much just how they achieve it.

I wouldn't call Vimes doggedly rational - at least, not compared to Granny, who is as rational as you can get in a world shaped like a disc resting on the backs of four elephants standing on a giant turtle swimming through space. Vimes just wishes the world made sense. Granny knows it doesn't.

Hrm... Can I request some sort of hybrid of Granny Weatherwax and Ponder Stibbons with a bit of Moist von Lipwig thrown in for good measure?

EDIT: and give the resulting being a carefully measured dose of Klatchian coffee.

Rincewind tends to be terrified of everything... but he does get stuff done. And in "The Science of Discworld II", he was the one who orchestrated the entire plot to free Roundworld from the elves.

(Although that book pissed me off in other ways. Thermodynamics is just "a theory about gases"?! That was its explicit justification for dismissing the relationship between thermodynamic entropy and information entropy.)

It's a different Jake. This one, Jake Stonebender, is from the Callahan's Crosstime Saloon stories. (I had to Google the name.) The Jake from the Dark Tower series is Jake Chambers.

This is what I've got. I suspect some of the characters I couldn't place are from anime/manga, which I've never been interested in. Otherwise, in rot13:

Gur guerr vqragvpny-ybbxvat oblf: Gbz Fjvsg?

Gur zvqqyr-ntrq jbzna jvgu n "dhvgr crphyvne qrzrnabe": Nyvpr.

Gnyy zna jvgu n funirq urnq: Ybhvf Jh.

Obl jvgu gur fpne ba uvf sberurnq: Uneel Cbggre.

Zna jvgu gur fubpx bs lryybj unve: Wbua Pbafgnagvar.

Obl va benatr jvagre pbng: Xraal.

Gur gnyy zna jvgu chcvyf "yvxr fgnef va gur avtug": Qernz (gur Raqyrff).

Gur zna jub bayl fnvq "Trg ba jvgu vg!": Wbua Pyrrfr, V fhccbfr?

Gur Anzryrff Bar vf sebz Cynarfpncr Gbezrag.

Gur Qbpgbe jvgu gur fpnes vf gur Sbhegu.

Gur jbzna jvgu gur fvyire ubbc: Kran.

Gur thl jub yvtugf gur gerr fghzc ba sver: Envfgyva Znwrer, V nffhzr?

Gur zhfphyne zna jvgu gur yragvphyne pelfgny: N Yrafzna (Ivetvy Fnzzf?)

For the first one... you mean the ones with the numbers on them? I don't quite understand why you think they are who you seem to think they are.

I thought the numbers were some clever Tom Swift reference, although for the life of me I couldn't figure it out. Swift popped into my mind because there have been at least three Tom Swifts about which it was unknown if they were the same person. I have no idea what those numbered characters might be from.

It's three different versions of Shinji Ikari, each coming from a different fanfic: "Thousand Shinji", "Shinji and Warhammer 40,000", and "Once More With Feeling".

Ah, I was wondering about those - I recognized the Eva characters instantly (I consider myself something of an Eva expert), but I couldn't figure out where the deuce the Shinjis were from (since I've long shunned Eva fanfics).

The first one made me laugh out loud. Three copies of Tom Swift hanging around, um, Asuka Langley Soryu...

The rest are correct, except that Wbua Pyrrfr is more of a generic representative for the show as a whole, and for Ivetvy Fnzzf I had in mind KK, not VS - though on reflection, VS would probably fit in better with the crowd.

Hrm, I thought it was supposed to be specifically "vg'f" thl sbe gur zbagl clguba bar.

Anyone else here ever read the webcomic One Over Zero? If you have, then you know why I'm mentioning it...

I was thinking more along the lines of "Old One was making a low-probability high-expected-value bet on the Zones of Thought being generated by a simulator with access to much larger resources", with the travelers being lockpickers who could operate in the corridor of Beyond, and Pham containing the payload. At least, Pham contained the obvious payload. There might be a much more subtle payload hidden in the interaction characteristics of the whole group, possibly appearing only after considerable computing time.

> There might be a much more subtle payload hidden in the interaction characteristics of the whole group, possibly appearing only after considerable computing time.

Interesting. The fact that none of the group members mention this possibility is good evidence that it is true (or there's some other kind of hidden payload). I guess we should expect that Maria's world will be taken over by the Old One eventually.

BTW, have you made any progress on the problem of how much resources an FAI should spend on exploring escapes out of simulation? The last time I asked, you were still stuck.

Hmm, you seem to wedge in general on paradoxes involving very large numbers times very small ones.

I know that my own heuristic is simply to refuse to multiply with very small numbers. This prunes the paradoxes but has obvious disadvantages. The counters "but, number of fundamental particles in universe" and "but, other daft things with similar small probabilities" feel like cop-outs.

Do you always refuse to multiply by small numbers, or only in certain situations? What do you do instead, then? How do you do your physics homework?

(rot13 for Fire Upon the Deep spoiler)

Gung synj vf va gur bevtvany nf jryy; gurer'f ab bgure ernfba gung gur tbqfunggre va Cunz jbhyq unir obgurerq nobhg gur Oyvtug syrrg neevivat nsgre gur Pbhagrezrnfher npgrq. (Naq jnf vg ernyyl jbegu gur fnpevsvpr bs unys bs Fwnaqen Xrv'f syrrg gb fnir Gvarf' Jbeyq sbe n srj zvyyraavn?)

And yes, I just read A Fire Upon the Deep in order to understand the fic.

Yeah, that bothered me too. But maybe Old One didn't know how long it would take to activate the Countermeasure.

After reading the story, it was interesting to see the subject of dovetailer algorithms mentioned over a year earlier here.

On the subject of programs embedded within programs, could you not say that even imagination, writing and reading, the making and watching of movies, and all other similar imaginative, storytelling arts lead to (or are) acts of simulation, and so increase the probability of said universe?

Have you ever read Heinlein's novel "The Number of the Beast"? The main characters invent a device for traveling between universes, and when they use it, they end up in the worlds of their favorite sci-fi novels. They end up concluding that universes are created by the act of imagining them, and that there must be a world in which they themselves are fictional characters thought up by some author.

I should point out here that (although I haven't read it) Lazarus Long is one of the book's characters, making it possibly an early episode in the Crossover if the latter were real.

Hrm... I just realized that I don't THINK I spotted any Potterverse chars in there. (That's mainly noteworthy in its absence simply because of the sheer volume of HP fanfic that's been written.)

EDIT: as jamesmacaulay's reply shows, it was my failure to notice rather than an actual absence.