Stupid mathematical nitpick:
> The chances of this happening are only .95 ^ 39 = 0.13, even before taking into account publication and error bias.
Actually, it is more correct to say that .95 ^ 39 = 0.14.
If we calculate it out to a few more decimal places, we see that .95 ^ 39 is ~0.135275954. T...
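The arithmetic is easy to check directly; a quick sketch in Python:

```python
# Check the nitpick: 0.95 ** 39 rounds to 0.14, not 0.13.
p = 0.95 ** 39
print(p)            # ≈ 0.1353
print(round(p, 2))  # 0.14
```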
What you are observing is part of the phenomenon of meta-contrarianism. Like everything Yvain writes, the aforementioned post is well worth a read.
Hmm. To me it seemed intuitively clear that the function would be monotonic.
In retrospect, this monotonicity assumption may have been unjustified. I'll have to think more about what sort of curve this function follows.
>> or they could even restrict options to typical government spending.
JoshuaFox noted that the government might tack on such restrictions.
That said, it's not so clear where the borders of such restrictions would be. Obviously you could choose to allocate the money to the big budget items, like he...
Even formalisms like AIXI have mechanisms for long-term planning, and it is doubtful that any AI built will be merely a local optimiser that ignores what will happen in the future.
As soon as it cares about the future, the future is a part of the AI's goal system, and the AI will want to optimize o...
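The point about long-term planning can be made concrete with a toy finite-horizon expectimax planner. This is only a sketch in the spirit of AIXI's expectimax, not AIXI itself, and the tiny environment with its reward numbers is invented for illustration:

```python
# Toy finite-horizon expectimax: a minimal illustration of a long-term
# planning mechanism (NOT AIXI; the environment below is a made-up example).

ACTIONS = ["wait", "grab"]

# (state, action) -> list of (next_state, probability, reward); toy numbers.
TRANSITIONS = {
    (0, "wait"): [(2, 1.0, 0.0)],   # waiting forgoes reward now but reaches state 2
    (0, "grab"): [(1, 1.0, 1.0)],   # small immediate reward, then nothing
    (1, "wait"): [(1, 1.0, 0.0)],
    (1, "grab"): [(1, 1.0, 0.0)],
    (2, "wait"): [(2, 1.0, 0.0)],
    (2, "grab"): [(1, 1.0, 10.0)],  # the big payoff only a planner reaches
}

def expectimax(state, horizon):
    """Best achievable expected total reward within `horizon` steps."""
    if horizon == 0:
        return 0.0
    best = float("-inf")
    for a in ACTIONS:
        # Expected immediate reward plus expected value of the future:
        value = sum(p * (r + expectimax(s2, horizon - 1))
                    for s2, p, r in TRANSITIONS[(state, a)])
        best = max(best, value)
    return best

print(expectimax(0, 1))  # myopic horizon: grabs the small reward, 1.0
print(expectimax(0, 2))  # longer horizon: the future enters the objective, 10.0
```

A purely local optimiser corresponds to horizon 1 here; as soon as the horizon extends, the planner's choice in state 0 is driven by what happens later.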
No, no, no: He didn't say that you don't have permission if you *don't* steal it, only that you *do* have permission if you do.
What you said is true: If you take it without permission, that's stealing, so you have permission, which means that you didn't steal it.
However, your argument falls apa...
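The visible half of that argument can be checked mechanically. A small truth-table sketch (the propositional encoding, with T for "you take it" and P for "you have permission", is my own framing of the comment):

```python
from itertools import product

# Propositional check: "stealing" is taking without permission (T and not P),
# and the granted statement is: stealing -> permission.
found_counterexample = False
for T, P in product([False, True], repeat=2):
    steals = T and not P
    statement = (not steals) or P        # steals -> P
    if statement and steals:
        found_counterexample = True

# No assignment satisfies both, so under the statement no taking is stealing:
print(found_counterexample)  # False
```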
You could use some sort of cloud service: for example, Dropbox. One of the main ideas behind Dropbox was to have a way for multiple people to easily edit stuff collaboratively. It has a very easy user interface for such things (just keep the deck in a synced folder), and you can do it even witho...
By observing the lack of an unusual amount of paperclips in the world which Skynet inhabits.
I have some rambling thoughts on the subject. I just hope they aren't too stupid or obvious ;-)
Let's take as a framework the aforementioned example of the last digit of the zillionth prime. We'll say that the agent will be rewarded if it gets it right, on, shall we say, a log scoring rule. This me...
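As a concrete sketch of that setup (the candidate digit set and the example probability reports are illustrative, not from the original comment):

```python
import math

# Log scoring rule: the agent reports a probability distribution over possible
# answers and is rewarded log(p) for the probability p it assigned to the truth.

def log_score(reported, truth):
    return math.log(reported[truth])

# Any large prime ends in 1, 3, 7, or 9; a maximally ignorant agent
# spreads its probability uniformly over those four digits.
uniform = {1: 0.25, 3: 0.25, 7: 0.25, 9: 0.25}
confident = {1: 0.85, 3: 0.05, 7: 0.05, 9: 0.05}

# Whatever the true digit is, the uniform report scores log(1/4):
print(log_score(uniform, 7))     # ≈ -1.386

# A confident report scores well when right and is punished hard when wrong:
print(log_score(confident, 1))   # ≈ -0.163
print(log_score(confident, 7))   # ≈ -3.0
```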
If a comment has 100% upvotes, then obviously the number of upvotes it got is exactly equal to the karma score of the comment in question.
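Spelled out, assuming karma is simply upvotes minus downvotes (an assumption about the scoring model, not something stated in the comment):

```python
def karma(upvotes, downvotes):
    # Assumed scoring model: karma = upvotes - downvotes.
    return upvotes - downvotes

# 100% upvotes means zero downvotes, so karma equals the upvote count:
print(karma(17, 0))  # 17
print(karma(17, 3))  # 14 — once any downvote appears the equality breaks
```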