"Because we're discussing an agent that has the freedom to choose between multiple possibilities"

Where is this freedom, exactly? "Freedom" here meaning not being utterly and totally constrained by prior causes.

We're born with a sense of fairness, honor, empathy, sympathy, and even altruism - the result of our ancestors adapting to play the iterated Prisoner's Dilemma. 

The keyword here is *sense*, and there's not a whole lot saying that this sense can't vanish as easily as it appears. Interpreting a human as a "fair, empathetic, altruistic being" is superficial. The status quo narrative of humanity is a lie/mass delusion: humanity is a largely psychopathic species covered in a brittle, hard candy shell of altruism and empathy. If this is true, the entire article is confused.

If you have a strong aversion to this claim, imagine how humanity would appear if you weren't human (don't overthink this; I'm just asking you to make an effort to shed pro-human bias), given how utterly unfazed it is by the knowledge that planet Earth has largely been a meat grinder and torture chamber for sentient life for billions of years. It continues to be an engine of mass suffering today, and it's safe to say the majority of the planet is pretty glib, indifferent, dissociated, and detached from this reality. If the gravity of the hell on this planet were felt in an instant, we would all vomit uncontrollably and stab ourselves in the face, but we've evolved to be a psychopathic species that just *doesn't* care that much. It turns out that caring, or being aware that you are in fact in a gradation of hell (at least for many living things around you, which I would argue is hell for you, whether you can appreciate it or not), is maladaptive.

If one values winning above everything else, then everything that leads to winning is rational. The reductio of this is that if torturing a googolplex of beings for the maximum duration and at ever-increasing intensity leads to winning, then that is what must be done.

Yet... perhaps winning then is not what we should most value? Perhaps we should value destroying the thing which values torturing a googolplex of beings. What if we need to torture half of a googolplex of beings to outcompete something willing to torture a googolplex of beings? What if outcompeting such a thing is impossible? What is the threshold for the number of beings tortured, in total? Such a question must by definition seem irrational to someone winning at all costs, this is the tradeoff one makes for valuing winning at all costs and calling it rationality. At which point does one say, "The most rational move is stopping all forward momentum immediately."?("You are missing the point! Rationality is just your *independant* strategy!" That is missing the point.) This does not appear to be a universe where a system which intends to maximize truth and ethics can win. I suspect once we can transcend temporal bias and egocentric bias via convincing virtual experience, in the specific sense of living lives like Junko Furuta's and Elisabeth Fritzl's, we will not appreciate winning at all costs. The paradox here is the thing which tends to reach convincing virtual simulations is not the thing which values simulating such things. That little voice in your head that says , "Error. Irrational appeal to emotion." is the same voice which tortures the entire multiverse to win(if this is the winning strategy). The conclusion here is that ethics and truth don't win. The thing which is least hindered by a commitment to values other than winning, wins. If anything could be said to bad, that is, if one is not a moral nihilist, then that would be bad news. Again worth noticing the little voice that rejects this word "bad", which upon having one's hands planted into hot coals for no reason, would appreciate things differently and realize an objective property of consciousness that is as grounded as the most basic mathematical expression.