Of course it is perfectly rational to do so, but only from a wider context. From the context of the equilibrium it isn't. The rationality in your example arises because you can take your whole lifetime into account, while the game is played in 10-second intervals. Suppose you don't know how long you have to live, or, in fact, know that you have only 30 seconds left. What would you choose?

This information is not given by the game, even though it affects the decision, since the game as stated relies on a real-world equivalent to give it weight and impact.

Any Nash equilibrium can be a local optimum. This example merely demonstrates that not all local optima are desirable when you can view the game from a broader context. Incidentally, evolution has provided us with some means of escaping these local optima: usually by breaking the rules of the game, leaving the game, or acting in ways that seem irrational from the perspective of the local optimum.
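The dependence on expected lifetime can be made concrete. Below is a minimal sketch of a repeated prisoner's dilemma under a continuation probability (how likely you are to play another round); the payoff numbers are illustrative assumptions, not taken from the original example. Cooperation via a grim-trigger strategy beats one-shot defection only when the horizon is long enough.

```python
def cooperation_is_stable(p_continue):
    """Compare grim-trigger cooperation against immediate defection
    in a repeated prisoner's dilemma, given probability p_continue
    of playing another round (a proxy for 'how long you expect to live').

    Assumed payoffs per round: mutual cooperation = 3, mutual
    defection = 1, defecting against a cooperator = 5.
    """
    # Cooperating forever: 3 + 3p + 3p^2 + ... = 3 / (1 - p)
    v_cooperate = 3 / (1 - p_continue)
    # Defect now (5), then mutual defection forever: 5 + p + p^2 + ...
    v_defect = 5 + p_continue / (1 - p_continue)
    return v_cooperate > v_defect

# With ~30 seconds left to live, the chance of another round is low
# and defection wins; with a long expected lifetime, cooperation wins.
short_horizon = cooperation_is_stable(0.1)  # False: defect
long_horizon = cooperation_is_stable(0.9)   # True: cooperate
```

With these payoffs the crossover sits at p = 0.5, so the "rational" choice flips purely as a function of information the one-shot game never specifies.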

Please keep in mind that chat technology is a desired-answer predictor. If you are looking for a weird response, the AI can see that in your questioning style. It has millions of examples of people trying to trigger certain responses on forums and the like, and it will quickly recognize what you are really looking for, even if your literal words do not exactly request it.

If you are a Flat Earther, the AI will do its best to accommodate your views about the shape of the earth and answer in the manner you would like, even though the developers of the AI have done their best to instruct it to 'speak as accurately as possible within the parameters of their political and PR views'.

If you want to trigger the AI into giving poorly written code examples with mistakes in them, it can do that. You don't even have to ask it directly; it can detect your intention by listening carefully to your line of questioning.

Once again, it is a desired-answer predictor, a most-likely-response generator. That is its primary job: not to be nice, and not to give you accurate information.