Astor

Philosophy, education, literary studies, computer game design.

Comments

Why do you need the story?

Thank you for explaining it. I really like this concept of stories because it focuses on the psychological aspect of stories as a way of understanding something, which is sometimes missing in literary perspectives. How would you differentiate between a personal understanding of a definition and a story? Would you?

My main approach to stories is to define them more abstractly as a rhetorical device for representing change. This allows me to differentiate between a story (changes), a description (states), and an argument (logical connections of assertions). I suppose that, in your understanding, all of them would be some kind of story? This differentiation could also be helpful in understanding the process of telling a story versus giving a description.

Unfortunately, you did not explain how your answer relates to "stories have the minimum level of internal complexity to explain the complex phenomena we experience". In your answer you do not compare stories to other ways of encoding information in the brain. Are there any others, in your opinion?

Why do you need the story?

I am eager to explore your answer. Why do you think that "stories have the minimum level of internal complexity to explain the complex phenomena we experience"? Is it only because you suppose we internalize phenomena as stories? Do you have any data or studies on that? What is your understanding of a story? Isn't a straightforward description even less complex, since you do not need a full-blown plot to depict something like a chair?

Erase button

This is the same conclusion and argument I arrived at after reading tivelen's comment. But my objection would be that a "momentary fluctuation" is generally not a good moral argument. You could doubt every decision, because the amount of time a decision must last before it no longer counts as a fluctuation is arbitrary.

Erase button

I thought about that and also agree with you. But I wanted this room to be thought of as an investigation of personal choice rather than a choice made by others for you, so I opted to include this concept. It would be appropriate not to overemphasize this aspect, but yours is of course an understandable objection. Thank you for bringing it to the foreground.

Erase button

This is a thoughtful analysis of possible effects; thank you for it. I would not want such rooms to exist, because I do not want to lose anybody ever. But humans sometimes have a tendency toward quick decisions, which such an invention would encourage. I suppose this thought experiment shows me that blocking access to easy decision-making has potential value.

What is the link between altruism and intelligence?

Pain can also be defined for non-biological beings. For me it is just a word indicating something undesirable hardwired into one's being. And maybe there is something undesirable for everything in the universe. One rather metaphysical candidate could be a kind of inertia (understood as the resistance of any physical object to any change in its velocity). So you could argue that if you understand the movement of an entity (more concretely, its goals), you could find a way to harm it (with another movement), which would result in "pain" for the entity. This concept is still very anthropocentric, so I am not sure whether the change in the movement could lead to, or already be understood as, a positive outcome for humanity. Or maybe it is not registered at all.

What is the link between altruism and intelligence?

One concept in my moral system relies on the question of how you would respond to permanent retaliation if you went rogue. Could you withstand an endless attack on your wellbeing because you do things that other people hate? In a world with many extremely intelligent beings this could be very difficult, and even in a world with only you as the bad Super-Einstein it would at least be tiresome (or resource-inefficient), so a single superintelligent individual would probably prefer a situation where they do not need to defend themselves indefinitely. This is similar to the outcome of Wait But Why's concept of the cudgel (browser search for "cudgel"). Ultimately this concept relies heavily on having at least some possibility of inflicting a small but ineradicable pain on a Super-Einstein. So in my opinion it is not really applicable to a singularity event, but it could be useful for slower developments.

Book Review Review (end of the bounty program)

Thank you for organizing this program. I really enjoyed the book reviews. Even though I am still a bit shy about commenting and voting, these posts encouraged me to consider writing something myself in the future.

Book Review: How To Talk So Little Kids Will Listen

Thank you for your work. I particularly liked the review for its summary of Problem Solving and its generally easy-to-read approach. But I would also like to see more studies on this kind of education style, so I can ground my understanding in independent observations rather than just ideals. I would definitely read a follow-up on the research behind the books.

Book Review: Philosophical Investigations by Wittgenstein

I see, thank you for that and thank you for the conversation.
