If someone uses different rules than you to decide what to believe, then things that you can prove using your rules won't necessarily be provable using their rules.
Yes, but the idea is that a proof within one axiomatic system does not constitute a proof within another.
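A standard concrete case (my own sketch, not anything from the original exchange): double-negation elimination, ¬¬P → P, is a theorem under classical axioms but not under purely constructive rules, so a classical proof of it simply isn't a proof to an intuitionist. In Lean 4 it looks something like this:

```lean
-- Sketch (my own illustration): double-negation elimination
-- type-checks only because we invoke a classical axiom.
theorem dne (P : Prop) (h : ¬¬P) : P :=
  Classical.byContradiction h

-- #print axioms dne
-- reports a dependency on 'Classical.choice'; a constructivist who
-- rejects that axiom therefore does not accept this as a proof.
```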
Not particularly, no. In fact, there probably is no such method - either the parties must agree to disagree (which they could honestly do if they're not all Bayesians), or they must persuade each other using rhetoric as opposed to honest, rational inquiry. I find this unfortunate.
Regarding instrumental rationality: I've been wondering for a while now if "world domination" (or "world optimization", as HJPEV prefers) is feasible. I haven't entirely figured out my values yet, but whatever they turn out to be, WD/WO sure would be handy for achieving them. But even if WD/WO is a ridiculously far-fetched dream, it would still be a very good idea to know one's approximate chances of success with various possible paths to achieving one's values. I have therefore come up with the "feasibility problem." Basical...
What if the disagreeing parties have radical epistemological differences? Double crux seems like a good strategy for resolving disagreements between parties that have an epistemological system in common (and access to the same relevant data), because getting to the core of the matter should expose that one or both of them is making a mistake. However, between two or more parties that use entirely different epistemological systems - e.g. rationalism and empiricism, or skepticism and "faith" - double crux should, if used correctly, eventually lead ...
This is an interesting idea, although I'm not sure what you mean by "it can work without people understanding why it works".
Shouldn't the people learning it understand it? It doesn't really seem much like learning otherwise.
That's a valid point - I suppose there's no harm as long as one is careful. Allowing any part of your map to gain too much autonomy, however - internalizing a belief-label - is something to avoid. That's not to say that identity is bad - there's nothing wrong with being proud that you're a fan of Lost, or of your sexual orientation, etc. There is, I believe, something wrong with being proud that you're an atheist/socialist/republican/absurdist/singularitarian (etc.).
Sorry about the text at the top, it's the wrong size for some reason. Does anybody know how to fix that?
I must admit to some amount of silliness – the first thought I had upon stumbling onto LessWrong, some time ago, was: “wait, if probability does not exist in the territory, and we want to optimize the map to fit the territory, then shouldn’t we construct non-probabilistic maps?” Indeed, if we actually wanted our map to fit the territory, then we would not allow it to contain uncertainty – better some small chance of having the right map than no chance, right? Of course, in actuality, we don’t believe that (p with x probability) with probability 1. We do n...
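(Answering my own puzzle a bit, with a hedged sketch: the usual way to see why a calibrated probabilistic map beats committing to a single "certain" map is a proper scoring rule. The 0.7 frequency below is purely an assumed illustration, not anything from the territory.)

```python
import math

# Illustrative assumption: the event the map describes occurs with
# frequency 0.7 in the territory.
p_true = 0.7

def expected_log_score(p_reported: float) -> float:
    """Expected log score (a proper scoring rule) of reporting p_reported."""
    eps = 1e-12  # keeps the 'certain' map from scoring exactly -infinity
    p = min(max(p_reported, eps), 1 - eps)
    return p_true * math.log(p) + (1 - p_true) * math.log(1 - p)

print(expected_log_score(0.7))  # calibrated map: about -0.61
print(expected_log_score(1.0))  # 'certain' map: about -8.3 (and -inf without eps)
```

Under any proper scoring rule, the expected score is maximized by reporting your actual credence, which is (I think) the standard answer to the question I posed above.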
I wrote an article, but was unable to submit it to discussion, despite trying several times. It only shows up in my drafts. Why is this, and how do I post it publicly? Sorry, I'm new here, at least so far as having an account goes - I've been a lurker for quite some time and have read the sequences.
Can anybody point me to some specific examples of this type of evolution? I'...