"Perhaps it is the fear of being too late that is causing you distress. Perhaps you fear that humanity is going to be destroyed because you didn't build an FAI soon enough. Perhaps you fear that your life will end some 10,000 years sooner than you'd like."
Humanity's alleged demise is not the only possible way he could be too late. I wonder where Eliezer would turn his attention if someone (or some group) solved the problems of FAI before him.
Eliezer has written a number of times about how comparing your intelligence and rationality to those around you is pointless (e.g. it's never good enough to merely be good in comparison). This philosophy has thus far been directed at comparing oneself to lower levels of cognition, but I don't see why it shouldn't work bottom-up as well. Learn from the levels above you, but do not lionize them. As we all aspire to embody the higher levels, I'm sure Jaynes must have also (an old vampire, but not old enough).
Eliezer: I don't think we should worry about our particular positions on the bell curve, or set goals for where we want to be. Don't fret over the possible limitations of your brain; doing so will not change them. Just work hard and try your best, always attempting to advance - push the limitations. Jaynes was struggling against his meat-brain too. It's human - you both surpassed the village idiots and the college professors, and the difference in levels becomes more and more negligible with each step toward the limit. Everybody is working with meat designed by the "idiot god". Push it to the limit, hate the limit, but don't be self-conscious about it.
We all wish we had gotten an earlier start on things. Their importance is perhaps something you have to learn as you grow.
The backwards reasoning in this problem is the same as in the unexpected hanging paradox, and similar to a problem called Guess 2/3 of the Average. A group of players each guess a number between 0 and 100, and the player whose guess is closest to 2/3 of the average of all guesses wins. Since the average can be at most 100, the winning target can never exceed (2/3)·100 ≈ 66.7, so it is irrational to guess above that. Once every rational player reasons this far, the average can be at most (2/3)·100, so no one should guess above (2/3)^2·100; after n such rounds of elimination, no one should guess above (2/3)^n·100. This has a limit of 0 as n → ∞, so it is irrational to guess anything greater than zero.
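To make the elimination concrete, here's a minimal sketch in Python (the function name and the choice of ten rounds are mine, purely for illustration):

    # After n rounds of eliminating dominated strategies, no rational
    # player guesses above (2/3)^n * 100.
    def elimination_bounds(rounds):
        """Upper bound on a rational guess after each round of elimination."""
        return [(2 / 3) ** n * 100 for n in range(1, rounds + 1)]

    for n, bound in enumerate(elimination_bounds(10), start=1):
        print(f"round {n:2d}: no rational guess above {bound:.2f}")
    # The bounds shrink toward 0, so the unique equilibrium guess is 0.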
"I think correct strategy gets profoundly complicated when one side believes the other side is not fully rational."
Very true. When you're not playing with "rational" opponents, it turns out that this strategy's effectiveness diminishes past the first level of elimination, n=1 (regardless, the average will never be greater than 67), and you'll probably lose if you guess 0 - how can you be rational in irrational times? If everybody is rational but there is no common knowledge of this, the same effect occurs.
The kicker is this: even if you play with irrational opponents, they're going to learn - even in a 3rd-grade classroom, the equilibrium eventually sets in at 0 after a few rounds of play. After the first round, players adjust their guesses, and each round 2/3 of the average gets lower until it hits 0. At that point, even if people don't rationally understand the process, they're guessing 0.
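If it helps, here's a toy simulation of that learning process, under the (strong) assumption that each round every player simply aims at 2/3 of the previous round's average:

    import random

    # Each round, every player guesses 2/3 of the previous average.
    # This adjustment rule is an assumption; real players adapt more
    # noisily, but any rule that tracks the target drags the average down.
    def play(num_players=30, num_rounds=12):
        guesses = [random.uniform(0, 100) for _ in range(num_players)]
        for round_number in range(1, num_rounds + 1):
            average = sum(guesses) / len(guesses)
            print(f"round {round_number:2d}: average = {average:6.2f}, "
                  f"winning target = {2 * average / 3:6.2f}")
            guesses = [2 * average / 3 for _ in guesses]

    play()

Under this rule the average falls by a factor of 2/3 every round, so it converges to 0 no matter how irrational the starting guesses were.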
That's what equilibrium is all about - you might not start there, or notice the tendency towards it, but once it's achieved it persists. Players don't even need to understand the "why" of it - the reason they cannot do better.
That's a bit of an offshoot - I'm not entirely sure how well it relates. But back to the TIPD...
"do you really truly think that the rational thing for both parties to do, is steadily defect against each other for the next 100 rounds?
Yes, but I'm not entirely sure it matters. If that's where the equilibrium is, that's the state the game is going to tend towards. Even a single (D, D) round might irrevocably lock the game into that pattern.
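For what it's worth, here's a toy sketch of the backward induction that produces that equilibrium (the payoff numbers are my own, in the standard PD ordering T > R > P > S):

    # (my move, their move) -> my payoff; 'C' = cooperate, 'D' = defect.
    PAYOFF = {
        ("C", "C"): 3, ("C", "D"): 0,
        ("D", "C"): 5, ("D", "D"): 1,
    }

    def dominant_action():
        """Return a strictly dominant stage-game action, if one exists."""
        for mine, other in (("C", "D"), ("D", "C")):
            if all(PAYOFF[(mine, o)] > PAYOFF[(other, o)] for o in "CD"):
                return mine
        return None

    # Round 100: no future to protect, so both players take the dominant
    # action (D). Given that, round 99 is strategically identical to a
    # one-shot game, and so on back to round 1.
    plan = [dominant_action() for _ in range(100)]
    print(set(plan))  # {'D'}: defect in every round

The code only checks stage-game dominance; the unraveling itself lives in the comment - once round 100 is settled, each earlier round inherits the same logic.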
It must have been intentional that all the Dystopia examples are almost one-to-one mappings of the real world - except for the cognitive one. That one stands out as strange, perhaps intentionally: the message is that the world is fucked, and we've only one more chance as the last Dystopian calamity looms before us.
As to the assignment:
Economic Weirdtopia: The production economy is entirely automated. Supply is near-infinite due to the combination of this automation with asteroid mining. (The weird part is that the political will was somehow mustered to accomplish this.) Quite oddly, class inequalities are no longer sustainable - due to the occasional public slaughter of the rising bourgeoisie and power elites.
Sexual Weirdtopia: What you described as Utopia seems pretty damn weirdtopia to me.
Governmental Weirdtopia: Each person is a congressman. "Leaders" are chosen by lot, or else elected on merit by representatives chosen by lot. Laws are written and interpreted by juries, who are themselves potentially open to prosecution for the verdicts they render. Lawmakers can be charged criminally by the people for the laws they pass.
Technological Weirdtopia: The human race has turned into a civilization of AIs flying around the solar system (within a Dyson sphere).
Cognitive Weirdtopia: "