MSRayne

I'm a very confused person trying to become less confused. My history as a New Age mystic still colors everything I think even though I'm striving for rationality nowadays. Here's my backstory if you're interested.

Comments

Lovely! I'm glad to hear it's making sense to you. I had a leg up in perceiving this - I spent several years of my youth as a paranoid, possibly schizotypal occultist who literally believed in spirits - so it wasn't hard for me, once I became more rational, to notice that I'd not been entirely wrong. But most people have no basis from which to start when perceiving these things!

That actually would also be worthwhile. We will have AGI soon enough, after all, and I think it's hard to argue that it wouldn't be sentient and thus deserving of rights.

I appreciate the argument for clarifying principles, but I'm still not sure exactly how you think it's best to find out what they are, or how to write a principle-declaration statement. What all goes into such a thing? Are there lists of principles to pick and choose from? Or a pattern language for building your own?

Ah, I see. By the way, I didn't intend to sound hostile or condescending, but I may have come across that way, in which case I'm sorry. I think the story is evidence that you have potential as a writer, and it's probably longer than anything I've ever written lol (I'm terrible at finishing things). I just would have preferred that you go through a revision phase or two before publishing it, and consider how to make the characters a bit more realistic. (For instance, it seems like he adjusts psychologically to the transformation much faster than a normal person would, and suddenly becomes a hero when previously he was just a kid.)

Another possible moral, btw, is that the freedom to develop oneself and experiment with alternative modes of being and organization is part of what makes us human - not our mere body plan. There is a case to be made that the people of Cloud Nine are more human than the people of the Founder's perfect city.

This is... interesting. I'm not exactly sure what the point is though, and it feels like fanfiction that was written in a stream of consciousness with no revision or planning. In particular I find it hard to believe that an earth-swallowing sea of maximizer AI nano would not... you know, actively do anything other than wait for people to fall into it. 5/10.

The funny thing is that the widespread attempts to make everything child-safe have backfired and made the entire world child-hostile.

It looks like most of my thinking (which, due to perfectionism and laziness, I have not really published anywhere) is about number 3 - characterizing target behavior - and some of 1 as well. I think a lot about what kind of world I actually want to steer towards (3), and that has led me to try to understand the real nature of the things that are ethically relevant (1), like the "self", rights, etc. But as you say I often do feel a bit guilty about not trying hard on 2 and 4. I don't think I really have the mind for that; I'm more a designer than a dev.

That seems like a bad move. Banning people from being human is always counterproductive. Any successful media of the future will be filled with weird porn; nobody wants to be censored.

I think we maybe shouldn't reify "concepts" at all. To some extent it seems to me like there are only patterns of internal and external behavior. Certain stimuli - experiences, feelings, trains of thought - lead to certain sequences of words being produced. The question should not be "what does the word mean?" but rather "what does the use of this word, in this context, say about the internal state of the person using it?" Words don't have fixed meanings - they are entirely contextual.

Concepts are not things that actually exist as stable units. Our use of a finite set of words makes us think they do, but in reality every pattern of neural activations is unique, and we reach for whichever words lie closest, in some vector space (shaped by our past experience of the same words in other contexts), to the position of the thought we actually want to express. Every such act shifts the positions of those words slightly in other people's semantic spaces, leading to semantic drift - something like a rise in entropy, a gas expanding through a room.
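A toy sketch of that "nearest word in a vector space" picture, assuming nothing beyond made-up 2-D vectors - the words, the numbers, and the nearest_word helper are all hypothetical, just to make the analogy concrete:

```python
import numpy as np

# Made-up 2-D "embeddings" for a handful of words (purely illustrative).
embeddings = {
    "joy":     np.array([0.9, 0.2]),
    "delight": np.array([0.8, 0.3]),
    "grief":   np.array([-0.7, 0.1]),
    "chair":   np.array([0.0, -0.9]),
}

def nearest_word(thought_vec, embeddings):
    """Return the word whose vector is most similar (by cosine) to the thought vector."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(embeddings, key=lambda word: cos(thought_vec, embeddings[word]))

# A "thought" that doesn't coincide exactly with any word: we settle for the closest one.
thought = np.array([0.85, 0.25])
print(nearest_word(thought, embeddings))  # prints "joy" for these toy vectors
```

In this picture, semantic drift corresponds to each use of a word nudging its vector slightly in the listener's space, so the "dictionary" is never actually fixed.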

Some people, such as myself, are not good at working with others, and end up in the Terry Davis position by default. Except that my "knowledge" is mostly abstract philosophical theorizing that I constantly worry is of no use to anyone, and that I have so far never managed to organize into a book or wiki others can read - though I have tried and am trying. What would your advice be to people who 1. have no money with which to start making more money, and 2. have limited social skills and get extremely anxious and overwhelmed whenever they try to coordinate with others on any kind of shared project?
