jokerman.site
Made a simple app that displays collective priorities based on individuals' priorities, linked here.
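(The app's actual method isn't described here; as a rough illustration only, one way to aggregate individuals' ranked priorities into a collective ranking is Borda-style scoring. Everything in the sketch below, including the example lists, is hypothetical.)

```python
# Minimal sketch (not the app's actual method): turn individuals' ranked
# priority lists into one collective ordering via Borda-style scoring.
from collections import defaultdict

def collective_priorities(individual_rankings: list[list[str]]) -> list[str]:
    """Each inner list is one person's priorities, highest first."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in individual_rankings:
        n = len(ranking)
        for position, item in enumerate(ranking):
            scores[item] += n - position  # top item gets the most points
    return sorted(scores, key=scores.get, reverse=True)

# Example: three people's priority lists -> one collective ordering.
print(collective_priorities([
    ["healthcare", "housing", "climate"],
    ["housing", "healthcare"],
    ["climate", "housing", "healthcare"],
]))
```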
Hypotheses for conditions under which the self-other boundary of a survival-oriented agent (human or AI) blurs most, i.e. conditions where blurring is selected for:
"Democracy is the theory that the common people know what they want and deserve to get it good and hard."
Yes, I think this is too idealistic. Ideal democracy (for me) is something more like "the theory that the common people know what they feel frustrated with (and we want to honor that above everything!) but mostly don't know the best collective means of resolving that frustration."
For example, people can have a legitimate complaint about healthcare being inaccessible to them, and yet the suggestion many would propose is something like "the government should spend more money on homeopathy and spiritual healing, and should definitely stop vaccination and other evil unnatural things".
Yes. This brings to mind a general piece of wisdom for startups collecting product feedback: feedback expressing pain points/emotion is valuable, whereas feedback proposing implementations/solutions is not.
The ideal direct-democratic system, I think, would do this too: divide comments like "My cost of living is too high" (valuable) from "Taxes need to go down because my cost of living is too high" (possibly valuable, but an incomplete extrapolation).
This parsing seems possible in principle. I could imagine a system where feedback per person is capped, which would incentivize people to express the core of their issues rather than extraneous solution details (unless they happen to be solution-level experts).
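To make that concrete, here is a rough sketch of such a system: submissions are capped per person, and a crude keyword heuristic stands in for real pain-point-vs-solution classification. All names, the cap, and the marker words are made up; a real system would need a proper classifier rather than keyword matching.

```python
# Hypothetical sketch: cap feedback per person, and crudely flag
# solution-style comments separately from pain-point comments.
from collections import defaultdict

MAX_FEEDBACK_PER_PERSON = 3
SOLUTION_MARKERS = ("should", "need to", "needs to", "must", "ought to")

feedback_by_person: dict[str, list[str]] = defaultdict(list)

def submit_feedback(person_id: str, text: str) -> str:
    if len(feedback_by_person[person_id]) >= MAX_FEEDBACK_PER_PERSON:
        return "rejected: per-person cap reached"
    feedback_by_person[person_id].append(text)
    lowered = text.lower()
    if any(marker in lowered for marker in SOLUTION_MARKERS):
        return "accepted: looks like a proposed solution (weighted lower)"
    return "accepted: looks like a pain point (weighted higher)"

print(submit_feedback("alice", "My cost of living is too high"))
print(submit_feedback("bob", "Taxes need to go down because my cost of living is too high"))
```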
I think beliefs, habits, and memories are pretty closely tied to the semantics of the word "identity".
In America/Western culture, I totally agree.
I'm curious whether alien/LLM-based minds would adopt these semantics too.
There are plenty of beings striving to survive, so preserving that isn't a big priority outside of preserving the big three.
I wonder under what conditions one would make the opposite statement: that there's not enough striving.
For example, I wonder if being omniscient would affect one's view of whether there's already enough striving or not.
My motivation w/ the question is more to predict self-conceptions than prescribe them.
I agree that "one's criteria on what to be up to are... rich and developing." More fun that way.
I made it! One day when I was bored on the train. No data is saved rn other than leaderboard scores.
"Therefore, transforming such an unconscious behavior into a conscious one should make it much easier to stop in the moment"
At this point I thought you were going to explain that the key was to start biting your nails consciously :)
Separately, I like your approach, thx for writing.
Made a silly collective conversation app where each post is a hexagon tessellated with all the other posts: Hexagon