Thanks Lumifer. The Prince is worth reading. However, transferring his insights about princedoms to designing and spreading memeplexes in the 21st century does have its limits. Any more suggestions?
Can somebody point out textbooks or other sources that lead to an increased understanding of how to influence more than one person? (The books I know address only 1:1 interactions or presentations.) There are books on how to run successful businesses, etc., but is there overarching knowledge that also covers successful states, parties, NGOs, religions, and other social groups? (This would also be of interest for how best to spread rationality.) In the Yvain framework: taking Moloch as a given, what are good resources that describe how to optimally influence Moloch, with its many self-interested agents and, for example, its inherent game-theoretic problems, as long as AI is not up to the task?
Heed the typical mind fallacy. Other people are not you. What you find interesting is not necessarily what others find interesting. Your dilemmas or existential issues are not their dilemmas or existential issues. For example, I don't find the question of "shall we enforce a police state" interesting. The answer is "No", case closed, we're done. Notice that I'm speaking about myself -- you, being a different person, might well be highly interested in extended discussion of the topic.
I strongly disagree and think it is unrelated to the typical mind fallacy. OK, the word "interesting" was too imprecise. However, the argument deserves a deeper look in my opinion. Let me rephrase: "Discussions of AI sometimes end where they have serious implications for real life." Especially if you do not enjoy entertaining the thought of a police state and increased surveillance, you should be worried if respected rationalist essayists come to conclusions that include them as an option. Closing your case when confronted with possible results of a chain of argumentation won't make them disappear. And a police state, to stay with the example, is either an issue for almost everybody (if it comes into existence) or for nobody. Hence, this is detached from, and not about, my personal values.
I conclude from the discussion that the term "rich" is too vague. Here is my version: I would be surprised to find many LWers who are not in the top percentile of the Global Rich List and who could not afford cryonics if they made it their life's goal.
I meant it especially with regard to individual members, as described in the point "priorities": the topics on LW are not a representative sample of the topics and conclusions that are relevant to the individual. In other words, the imaginary guide I would write for my children on "how to be rational" differs very much from the guide that LW is providing.
Definitely. I am slightly annoyed that I missed that. The line spacing and paragraph spacing still seem a bit off compared to other articles. Is there anything I am doing wrong?
They are, but I still would not wear them. (And no rings for men unless you are married or have been a champion in basketball or wrestling.)
Let's differentiate two cases in whom we may want to address: 1) Aspiring rationalists: That's the easy case. Take an awesome shirt, sneak in "LW" or "pi" somewhere, and try to fly below the radar of anybody who would not like it. A Möbius strip might do the same; a drawing of a cat in a box may work but could also be misunderstood. 2) The not-yet-aspiring rationalist: I assume this is the main target group of InIns. I consider this much more difficult, because you have to keep the weirdness points below the gain, and you have to convey interest in a difficult-to-grasp concept on a small area. And nerds are still less "cool" than sex, drugs, and sports. A SpaceX T-shirt may do the job (rockets are cool), but LW concepts? I haven't seen a convincing solution, but I will ask around. Until then, the best solution to me seems to be to dress as your tribe expects and to find other ways of spreading the knowledge.
Fair enough. Maybe I should take Elon Musk out: with WBW he has found a way to push the value of advertising beyond the cost of his time spent. If Zuckerberg posts too, I will be fully falsified. To compensate, I introduce a typical person X whose personal cost-benefit analysis of posting an article is negative. I still argue that this is the standard case.
Did anything come from this? Would love to see that, too!