Because the purpose of horror fiction is to entertain. And it is more entertaining to be wrong in an interesting way than it is to be right.

>"I'm going to do high-concept SCP SF worldbuilding literally set in a high-tech underground planet of vaults"

I do not consider this story scifi, nor PriceCo to be particularly high tech.

>"and focus on the details extensively all the way to the end - well, except when I get lazy and don't want to fix any details even when pointed out with easy fixes by a reader"

All fiction breaks down eventually, if you dig deep enough. The fixes were not easy in my estimation. I am thinking now, however, that this story was a poor fit for this platform.
You may also enjoy these companion pieces:
I purposefully left it indeterminate so readers could fill in the blanks with their own theories. But broadly, it represents a full, immediate, and uncontrolled comprehension of recursive, fractal infinity: the pattern of relationships between all things at every scale, microcosm and macrocosm. More specific to this story, I like to think they were never human, but were always those creatures dreaming they were humans, shutting out the awful truth by means of the dome, which represents brainwashing / compartmentalization. I am not dead-set on this interpretation, though, and have written other stories in this setting which contradict it. Incidentally, this story was inspired by the following two songs:
Fair point. But then, our most distant ancestor was a mindless maximizer of sorts with the only value function of making copies of itself. It did indeed saturate the oceans with those copies. But the story didn't end there, or there would be nobody to write this.
Good catch, indeed you're right that it isn't standard evolution and that an AI studies how the robots perish and improves upon them. This is detailed in my novel Little Robot, which follows employees of Evolutionary Robotics who work on that project in a subterranean facility attached to the cave network: https://www.amazon.com/Little-Robot-Alex-Beyman-ebook/dp/B06W56VTJ2
This is a prologue of sorts. It takes place in the same world as The Shape of Things to Come, The Three Cardinal Sins, and Perfect Enemy (recently uploaded at the time of writing), with The Answer serving as the epilogue.
I appreciate your insightful post. We seem similar in our thinking up to a point. Where we diverge is that I am not prejudicial about what form intelligence takes. I care that it is conscious, insofar as we can test for such a thing. I care that it lacks none of our capacities, so that what we offer the universe does not perish along with us. But I do not care that it be humans specifically, and I feel there are carriers of intelligence far better suited to the vacuum of space than we are, or even than cyborgs are. Does the notion of being superseded disturb you?
Well put! While you're of course right in your implication that conventional "AI as we know it" would not necessarily "desire" anything, an evolved machine species would. Evolution would select for a survival instinct in them, as it did in us. All of the activities you observe falling along those same lines are driven by instincts programmed into us by evolution, which we should expect to be common to all products of evolution. I speculate that a strong AI trained on human connectomes would also have this quality, for the same reasons.
Conservatism, just not absolute.
This feels like an issue of framing. It is not contentious on this site to propose that an AI which exceeds human intelligence will be able to produce technologies beyond our understanding and beyond our ability to develop on our own, even though that proposal expresses the same meaning.