Eliezer, you might do well to thoroughly understand and consider Fare's criticisms of you. He seems to be one of the few readers here who isn't in awe of you, and has therefore managed to nail some fundamental mistakes you're making about human values. Mistakes that have been bugging me for some time too, but that I haven't been able to articulate, possibly because I'm too enchanted by your cleverness.
We don't value strangers on par with our friends and family, let alone freaky baby-eating or orgy-having aliens. Furthermore, I don't want to be altered in such a manner as to make me so empathetic that I give equal moral standing to strangers and kin. I believe THAT would make me less human. If you or an FAI alters me into such a state, you are not correctly extrapolating my volition, nor that of who knows how many other people like me. Do you have an estimate of how many people like that there are? How did you come by such an estimate?
So anyway, if this happened in any real future, I have no doubt some star would soon get supernova'd: the current star, Huygens, the Happy homeworld, the Eater homeworld, or Sol, in that order of likelihood. For these idealized humans inhabiting the uncanny valley of empathy that creates the whole contrived dilemma in the first place, who knows? Maybe the fact that a nova was what brought them there, and that they're now contemplating creating a supernova, is some kind of clue. Maybe the definition of "non-sentient baby" can be stretched to the point where the story ends as a blowjob joke, but I doubt it. Also, the mechanics of exactly how other people's pain affects the Happies haven't really been examined. It sounds like they're merely extrapolating the pain they think others must be feeling... given that they've had no scruples against engineering other sources of discomfort out of existence, why not engineer that out of existence too?
Come to think of it, perhaps I should explain why I think there will be a long-term downward trend.
Given that every large, complex civilization before ours has eventually collapsed, the burden of proof should be on those who claim that ours is exceptional and will not collapse. If I'm wrong and our civilization outlives me, I don't mind losing some money on my investments. I'd rather be a poor guy in a rich civilization if this insures me against being a poor guy in a poor civilization. If I'm right, I want to at least use the fact that I'm right to extract some wealth as a consolation prize and use that wealth to help myself and other techno-geeks survive and perhaps somehow hasten the next civilizational cycle.
If I had to guess, the most likely cause of collapse would be dwindling oil and gas supply combined with a growing demand caused by population growth and increasing industrialization. It's true that this creates incentives to develop alternative energy sources, but there are organizational, informational, and physical limitations on how quickly our infrastructure can be retooled to such alternatives. It is prudent to hedge one's bet that engineers and entrepreneurs will win this race. Furthermore, every single alternative has a higher per-kilojoule cost than oil. Therefore, even if liquefied coal, or biodiesel, or fuel cells become commercially viable, we will still be spending a larger fraction of our wealth just keeping the lights on than we do now, and I would expect the effect on the economy to be similar to a tax of the same amount.
But that's just my guess. I don't claim to know why our civilization will collapse, nor when this will happen. I do know that this collapse will not happen overnight, and there may be a long time window during which to exploit it before "the market is at 0".
WOW! Equity Private, this is the most lucid and useful analysis of a stagnation investment strategy I've seen. Thank you so much for posting.
How would your strategy be different if the goal was to get a modest return in a stagnating market, a larger return in a market crash, and a loss in the event of sustained growth? Do you think there is a way to guard against transient bubbles?
Deliberately worsening an economic slump carries with it the moral burden of possibly slowing down technological advances that would otherwise save the lives of millions of people currently doomed to die of old age.
Also, a sufficiently bad economic slump will diminish hope of successful cryopreservation for those still alive and permanently eliminate hope of revival for those already cryosuspended by causing either cryonics facilities or their suppliers to go out of business.
Michael Vassar: I agree about nano-goo and paperclip scenarios having too many unknowns to be planned for at this time except in very broad terms. Regarding economic collapse, what do you think of the following statements...
Eliezer: I'm glad you're finally willing to at least consider the possibility that the world can end up someplace that is neither paperclips nor singulariparadise. A few years back a friend of mine asked you on the #sl4 IRC channel "what do you need to continue your work in the event of collapse" and your response bordered on dismissive. What got you to take long-term economic slump seriously as an existential threat?
Jonathan El-Bizri: on what grounds do you assume we will have GM biofuels online in less than ten years? How much per gallon in 2009 dollars do you expect such a fuel to cost? Why?
Jes: Mad Max scenarios happen all the time. Daily life in much of sub-Saharan Africa is one big Mad Max scenario. Afghanistan is a Mad Max scenario (and was before the US invaded, and still was before the Soviets invaded). What the hunker-in-a-bunker crowd gets wrong is the transition phase: they neglect to think about what needs to happen BETWEEN now and the final descent into chaos and barbarism. What are the upper and lower bounds on how long this transition phase takes? What might be some indicators that the transition is occurring and how far along it is? Guns and canned food only become valuable assets in the late stages. Nobody seems to even ask what's a good investment in the early and middle stages of descent. If only they did, they might end up with more spending power to use on said guns and canned food in the later stages. Thoughts?
As for me, I've been paying attention to some bearish index-tracking ETFs, whose value moves inversely to the performance of the target index (e.g. SRS, SKF). Certain alternative energy companies might also be good investments, as might railroads and their suppliers. Coal and "green" coal processing. But of course investing in individual companies or even sector-tracking ETFs requires a lot more research; caveat emptor.
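One wrinkle with such funds worth knowing before betting on them: they typically rebalance daily, so they compound the daily inverse return rather than tracking the inverse of the index over long periods. A minimal sketch of that effect (the -1x leverage and the price numbers here are illustrative assumptions; SRS and SKF were actually -2x leveraged funds):

```python
def inverse_etf_value(daily_index_returns, leverage=-1.0, start=100.0):
    """Compound the leveraged daily return of the tracked index."""
    value = start
    for r in daily_index_returns:
        value *= 1.0 + leverage * r
    return value

# Index drops 10% one day, then recovers to its starting level the next.
returns = [-0.10, 1.0 / 0.90 - 1.0]  # net index change over two days: 0%

index_end = 100.0 * (1 + returns[0]) * (1 + returns[1])  # back to 100.0
etf_end = inverse_etf_value(returns)                     # about 97.78

print(round(index_end, 2))  # 100.0
print(round(etf_end, 2))    # 97.78 -- "volatility drag" from daily rebalancing
```

The upshot: in a choppy sideways market a daily-rebalanced inverse fund can bleed value even when the index goes nowhere, so these instruments fit a short, sharp decline better than a slow multi-year stagnation.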
Here's an odd bias I notice among the AI and singularity crowd: a lot of us seem to only plan for science-fictional emergencies, and not for mundane ones like economic collapse. Why is that? Anybody else notice this?
@Sigivald: You're right. A sufficiently severe economic downturn will kill Alcor and CI dead, along with all those currently in cryostasis. Economic/political/infrastructure instability is the biggest "existential risk" for cryonicists, but nobody can be arsed to prepare contingency plans for it because, I guess, it doesn't have the sexy science-fictiony cachet of asteroid hits or grey goo.
Reverse absurdity bias anybody?
Disclaimer: I am signed up.
Simon: "Eliezer tries to derive his morality from human values"
I would correct the above to "Eliezer tries to derive his morality from stated human values."
That's where many of his errors come from. Everyone is a selfish bastard. But Eliezer cannot bring himself to believe it, and a good fraction of the sorts of people whose opinions get taken seriously can't bring themselves to admit it.