That's helpful input, thanks. After reading the link and searching the wiki, I suspect it's more likely an akrasia/urges-vs.-goals sort of thing, based on my reaction to noticing the inconsistency: I felt a need to bring my actions in line with my professed beliefs.
Very interesting. I have transhumanist beliefs that I claim to hold, but my actions imply that I merely believe that I believe, if I understand the concept properly.
A prime example is how I tend to my health. There are simple, rational steps I can take to increase my odds of living long enough to hit pay dirt. I take okay care of myself, but I could do better. Much better.
Cryonics may be another example. More research is required on my part, but a non-zero last stab is arguably better than nothing. Still, I am not enrolled; it feels a bit like Pascal's Wager to me. Perhaps it...
This is a well-supported article with real-life applications. Even better, it shines a spotlight on holes in my thinking. I appreciate it when I read something that makes me want to slap my forehead and yell, "well, of COURSE!"
Thank you for your time putting this together.
Thanks! I consider myself more self-aware than most, largely because I have already done work similar to what the Luminosity sequence proposes. Of course, interesting arguments could be had about how subjective the experience is, and about which 'self' I am even trying to be aware of (or would that just be semantics?), but the result was a net gain in my quality of life. I'm curious to try the work with different techniques, though.
It will be interesting to see if the concept I hold of myself as pretty self-aware survives around here. All part of the process, I suppose.
As far as the math... If I don't try I definitely won't learn it. It will be a struggle, though.
Howdy,
tl;dr: This seems like a place I can use to shore up some of my cognitive shortcomings, eliminate some biases, and expand my worldview. Maybe I can help someone else along the way.
I have been reading the material here for the last several days and have decided that this is a community I would like to be a part of and, hopefully, contribute to. My greatest interests are improving my map of the territory (how great is that analogy?), using my constantly improving map to be a better husband and father, and exploring transhumanist ideas and conceits...
I'm inclined to disagree. While I am far from a weapons-grade philosopher, it seems to me that if we can rationally assign suffering any negative value, then the suffering of a more sentient being is the worse thing.
Say a goldfish is imprisoned in a fishbowl and allowed to starve to death. Say a human being endures the same thing. The goldfish will die in a poor fashion (are there good ones?) and will suffer greatly. The human, by virtue of intellect, can suffer in ways that the goldfish cannot. The human can rail against the injustice of their situation. The human...