Comments

Does anybody know of a mood-tracking app that asks you about your mood at a random time of day? (A simple rating of the mood, and maybe a small question about whether something happened that day influencing your mood.) All the apps I found required me to open them, which meant I would forget to rate my mood, or, when I was down, I just couldn't be bothered. So it would be perfect if it just popped up a daily alert, made me choose something, and then disappeared.

  1. It must kill you (at least make you unconscious) on a timescale shorter than that on which you can become aware of the outcome of the quantum coin-toss.
  2. It must be virtually certain to really kill you, not just injure you.

Both seem to be at odds with the Many-Worlds Interpretation. In an infinite number of worlds the apparatus will merely injure you, and/or you will become aware of the outcome first, due to some malfunction.

Isn't this a formalization of Pascal's mugging? It is also reminiscent of the human-sacrifice problem: if we don't sacrifice a person, the Sun won't rise the next day. We have no proof, but how can we check?

A good AI (not only Friendly, but useful to the fullest extent) would understand the intention, and hence answer that luminiferous aether is not a valid way of explaining the behavior of light.

After years of confusion and long hours of puzzling it out, in one brief moment I finally understood how it is possible for cryptography to work, and how Alice and Bob can share secrets despite a middleman listening from the very start of their conversation. And of course, now I can't imagine not having gotten it earlier.
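For anyone who wants the concrete trick: below is a minimal sketch of Diffie-Hellman key exchange, one standard construction that lets Alice and Bob agree on a secret entirely in public. (Which construction produced my "aha" is left as an assumption here; the numbers are toy-sized and nothing in this sketch is actually secure.)

```python
import random

# Public parameters, visible to everyone, including the eavesdropper.
p = 23  # a prime; real systems use primes of roughly 2048 bits
g = 5   # a generator modulo p

# Each party picks a private exponent and never transmits it.
a = random.randrange(2, p - 1)  # Alice's secret
b = random.randrange(2, p - 1)  # Bob's secret

# The only values sent over the wire:
A = pow(g, a, p)  # Alice -> Bob
B = pow(g, b, p)  # Bob -> Alice

# Both sides derive the same key, g^(a*b) mod p.
key_alice = pow(B, a, p)
key_bob = pow(A, b, p)
assert key_alice == key_bob
```

The middleman sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, believed intractable for large primes, so the shared key stays secret even though every transmitted message was public.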

Is there a foundation devoted to the promotion of cryonics? If not, it would probably be very desirable to create one. Popularizing cryonics could save an incredible number of existences, and so many people who support cryonics would probably be willing to donate money toward more organized promotion. Not to mention the personal gains: the more popular cryonics became, the lower the costs and the better the logistics.

If you are, or know, someone who supports cryonics and has experience or knowledge of non-profit organisations or professional promotion, please consider this.

I'm sorry for the overly light-hearted presentation. It seemed suited to presenting what is, to simplify greatly, a form of fun.

A Waker's reality doesn't really rely on dreams, but on waking into new realities, and on a form of paradoxical commitment held equally toward the reality she lives in and a random reality she would wake up in.

Its rationale is purely a step in exploring new experiences, a form of meta-art. Once human and transhuman needs have been fulfilled, posthumans (and here, at the least, I expect my future self) would search for entirely new ways of existing, new subjectivities. That is what I consider posthumanism: meddling with the most basic imperatives of conscious existence.

I see it as just one possibility to explore, something to let copies of myself experience. (Those are not independent copies, however; I imagine a whole cluster of myselves, interconnected and gathering understanding of each other's perceived realities. Those living Wakers' lives would be less concerned with the existence of other copies; rather, their experiences would be watched by higher-level copies.)

Disclaimer: This comment may sound very crackpottish. I promise the ideas in it aren't as wonky as they seem, but it would be too hard to explain them properly in so short a space.

By living your life in this way, you'd be divorcing yourself from reality.

Here enters the notion that in posthumanism there is no definite reality. Reality is a product of experiences and of how your choices influence those experiences. In posthumanism, however, you can modify it freely. What we call reality is a very local phenomenon.

Anyhow, it's not the case that your computing infrastructure would be in danger: it would either be protected by some powerful AI, much better suited to protecting your infrastructure than you are, or there would be other copies of you handling maintenance in "meatspace". (Again, I strongly believe it's only our contemporary perspective that makes us feel that the reality in which the computations are performed is more real than the virtual one.)

What's more, a Waker can be perfectly aware that there is a world beyond what she experiences, and may occasionally leave her reality.

Well, creating new realities at will and switching between them is an example of a Hub World. And I expect that would indeed be the first thing new posthumans would go for. But that type of existence is stripped of many of the restrictions which, in a way, make life interesting and give it structure. So I expect some posthumans (among them, my future self) to create curated copies of themselves that would gather entirely new experiences, like the Waker's subjectivity. (Its experiences would be reported to some top-level copy.)

You see, a Waker doesn't consider waking to be abandoning everything, the way we do. She doesn't feel abandonment, in the same way that we don't feel we have abandoned everything and everyone in a dream. She is perfectly aware of both the current world and the world to come, each feeling exactly as real.

To put it another way: for a Waker, staying in one reality forever feels the way staying in a dream and never waking up to experience the actual reality would feel to us.

There are, of course, many possible variants. The one I focus on is largely solipsistic, where all the other people are generated by an AI. Keep in mind that the AI needs to fully emulate only a handful of personas, and they are largely recycled in the transition to a new world. (Option 2, then.)

I can understand your moral reservations; we should, however, keep the distinction between a real instantiation and an AI's persona. Imagine the reality-generating AI as a skilful actor and writer. It generates a great number of personas with different stories, personalities, and apparent internal subjectivity. When you read a good book, you usually cannot tell whether the events and people in it are real or made up; the same goes for a skilful improv actor: you cannot tell whether you are watching a real person or just a persona. In that sense they all pass the Turing test. However, you wouldn't consider a writer to be killing a real person when he ceases to write about some fictional character, or an actor to be killing a real person when she stops acting.

Of course, you may argue that this makes the Waker's life meaningless if she is surrounded by pretenders. But that seems silly; her relationship with other people is the same as yours.
