
I view unhappiness, like pain, as useful information. If you find your hand over a burner, you turn off the flame rather than reconditioning yourself to enjoy the sensation of scorching flesh. Doing otherwise risks losing the limb entirely. Why should I react differently to oppressive and idiotic social circumstances? I desire external rather than internal change.

As a side note, this piece reinforces the sort of gender ideology and interpersonal hierarchies that contribute to making life unbearable: "Dudes, do this. Girls, do that." It assumes monogamous romantic relationships to be natural, correct, and omnipresent. To the extent that this sort of conformity produces happiness, I want no part in it. Instead I encourage anger and rebellion.

I'm skeptical about the whole practice of studying happiness and trying to be happier based on this body of knowledge. Who knows what self-reports actually mean? Social dynamics play a huge role in determining how happy people claim to be. Moreover, the entire enterprise of feeling good for its own sake strikes me as reactionary. Focusing on the personal ignores the social conditions responsible for so much suffering. I have the same complaints about zen. As Martin Luther King said, I'm proud to be maladjusted to the horrors that surround me. I wouldn't want to be content under current nightmarish circumstances.

Considering that medical errors apparently kill more people than car accidents each year in the United States, I suspect the establishment is not in fact infallible.

What do y'all think about John Smart's thesis that an inward turn is more likely than the traditional script of galactic colonization?

Rather wild read, but perhaps worth a thought. Would that alternative trajectory affect your opinion of the prospect, XiXiDu?

On balance I'm not too happy with the history of existence. As Douglas Adams wrote, "In the beginning the Universe was created. This has made a lot of people very angry and been widely regarded as a bad move." I'd rather not be here myself, so I find the creation of other sentients a morally questionable act. On the other hand, artificial intelligence offers a theoretical way out of this mess. Worries about ennui strike me as deeply misguided. Oppression, frailty, and stupidity make hanging out in this world unpleasant, not any lack of worthwhile pursuits. Believe me, I could kill a few millennia no problem. If Kurzweil's dreams of abundance (in every sense) come true, I won't be complaining.

Now, the notion of a negative but nonfatal Singularity deserves consideration. The way I typically see things, there's either death or Singularity in the long run, and both are good. Indefinite life extension without revolutionary economic and social change would be a nightmare, though perhaps better at every individual point than the pain of aging.

Your concerns about the ultimate fate of the universe are intriguing but too distant to arouse much emotion from me. Who knows what will happen then? Such entities might travel to other universes or forge their own. I'll just say that judging by the present record, intelligence and suffering go together. Whether we can escape this remains to be seen.

Thank you for posting this, Kaj. It's exactly what the community needs at this time. Far too many transhumanists accept the claims coming out of evolutionary psychology uncritically. Bravo!