Comments

Most of the probabilities are epsilon, or 100% minus epsilon, or larger/smaller epsilons (the size of the epsilon reflecting the difference between being "unsure about known psychology" and "unsure about known logic").

Maybe I will do it next year, because I have a lot to change.

I rushed to write an angry comment about how it is all wrong, but a few seconds after posting the comment (oops) I understood. I have known a great example since school genetics: when two heterozygotes are crossed (Aa with Aa), the frequency of homozygotes among the descendants with the dominant trait is 1/3. The offspring are AA, Aa, aA, aa, and aa may never survive to adulthood (or AA may not survive, or both survive but we aren't interested in them).

There may be something that influences the 1:2:1 proportion (perhaps only on one side?), but that is a "you flip a loaded coin; what is your bet on it landing heads?" kind of case.
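
A minimal sketch of the arithmetic behind that 1/3, assuming a plain Punnett square for an Aa × Aa cross (the enumeration below is purely illustrative):

```python
from itertools import product

# Aa x Aa cross: each parent passes on 'A' or 'a' with equal probability,
# so the four equally likely genotypes are AA, Aa, aA, aa (the 1:2:1 ratio).
offspring = ["".join(pair) for pair in product("Aa", repeat=2)]

# Only descendants showing the dominant trait (at least one 'A') are considered.
dominant = [g for g in offspring if "A" in g]          # AA, Aa, aA
homozygous = [g for g in dominant if g == "AA"]        # AA

print(len(homozygous) / len(dominant))                 # 0.333... = 1/3
```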

I've seen quite a few people who said that Jews (and only Jews) did the Holocaust. The implied reasons range from "in order to blame everyone and profit" to "it was a magical sacrifice to gain power". (The fact that many other peoples were on the list for annihilation, usually including the statement author's own, is hard to plant in their minds. As if that were the biggest problem with such a statement...)

I do agree that an AI that is underdeveloped in terms of its goals, yet allowed to exist, is all too likely to become an ethical and/or existential catastrophe, but I have a few questions.

  1. If neurosurgery and psychology develop sufficiently, would it be ethically okay to align humans (or newborns) to other, more primitive life forms to the same extent we want to align AI to humanity? (I didn't say "in the same way", because the human brain seems to be organized differently from programmable computers, but I mean practically the same change of behaviour and/or goals.)
  2. Does anyone who mentions that AI would become more intelligent than the whole human civilization think that AI would therefore be more valuable than humanity? Shouldn't AI goals be set with consideration of that? If not, isn't the answer to question 1 "yes"?

Your link is broken.

Well, cultural relativity is a fact, as there is no absolute morality and people either justify any of their actions via tradition or simply follow it when they don't want to think. Universal life rights would be great (no weaker than human rights, at least). I'm part legalist and part ecocentrist: I want sentience to remain in order to save the biosphere from the geological and astronomical events that will come sooner than a new Homo sapiens could emerge through evolution, if the current one goes extinct before making AGI. Everything else, I upvote.

[This comment is no longer endorsed by its author]