But I didn't buy any of the counterarguments to the extent that would be necessary to counter the 10^100.

I don't think this is very hard if you actually look at examples of long-term investment. Background: http://www.gwern.net/The%20Narrowing%20Circle#ancestors and especially http://www.gwern.net/The%20Narrowing%20Circle#islamic-waqfs

First things first:

Businesses and organizations suffer extremely high mortality rates; one estimate puts it at 99% chance of mortality per century. (This ignores existential risks and lucky aversions like nuclear warfare, an…)

So to survive 12,000 years (120 centuries), any perpetuity has a survival probability of roughly 0.01^120 = 10^-240, which more than cancels out the 10^100 return.
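To make the sizes concrete, here is a minimal back-of-the-envelope sketch of that calculation (working in logs to avoid underflow; the 99%-per-century mortality and 12,000-year horizon are the figures above):

```python
from math import log10

survival_per_century = 0.01   # 99% mortality per century (the estimate above)
centuries = 12_000 // 100     # 120 centuries

log_survival = centuries * log10(survival_per_century)   # -240
log_payoff = 100                                          # Hanson's 10^100 return

print(f"survival chance     ~ 10^{log_survival:.0f}")               # 10^-240
print(f"expected multiplier ~ 10^{log_payoff + log_survival:.0f}")  # 10^-140
```

Even a googol-sized payoff is wiped out by a 10^-240 survival chance.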

The premises in this argument aren't strong enough to support conclusions like that. Expropriation risks have declined strikingly, particularly in advanced societies, and it's easy enough to describe scenarios in which the annual risk of expropriation falls to extremely low levels, e.g. a stable world government run by patient immortals, or with an automated legal system designed for ultra-stability.

ETA: Weitzman on uncertainty about discount/expropriation rates.
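A minimal sketch of the Weitzman point, with made-up numbers for illustration (the 4.5%/yr and 0.01%/yr hazards and the 50/50 split are hypothetical): when the long-run expropriation rate is uncertain, the expected survival probability is dominated by the lowest-rate scenario rather than by the average rate.

```python
import math

# Hypothetical 50/50 uncertainty over the annual expropriation hazard.
# 4.5%/yr gives exp(-4.5) ~ 1% survival per century (the estimate above);
# 0.01%/yr stands in for a very stable regime (patient immortals, etc.).
scenarios = [(0.5, 0.045), (0.5, 0.0001)]   # (probability, annual hazard)
t = 12_000                                   # years

# Expected survival: average the survival probabilities across scenarios.
expected = sum(p * math.exp(-h * t) for p, h in scenarios)

# Naive survival computed at the average hazard rate.
avg_hazard = sum(p * h for p, h in scenarios)
naive = math.exp(-avg_hazard * t)

print(f"expected survival      ~ 10^{math.log10(expected):.0f}")   # ~ 10^-1
print(f"survival at avg hazard ~ 10^{math.log10(naive):.0f}")      # ~ 10^-118
```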

Neotenic: I stand corrected. Thank you, Gwern.

A Rational Altruist Punch in The Stomach

by Neotenic · 1st Apr 2013 · 1 min read · 54 comments



 

Robin Hanson wrote, five years ago:

Very distant future times are ridiculously easy to help via investment.  A 2% annual return adds up to a googol (10^100) return over 12,000 years, even if there is only a 1/1000 chance they will exist or receive it. 

So if you are not incredibly eager to invest this way to help them, how can you claim to care the tiniest bit about them?  How can you think anyone on Earth so cares?  And if no one cares the tiniest bit, how can you say it is "moral" to care about them, not just somewhat, but almost equally to people now?  Surely if you are representing a group, instead of spending your own wealth, you shouldn’t assume they care much.

So why do many people seem to care about policy that affects far future folk? I suspect our paternalistic itch pushes us to control the future, rather than to enrich it. We care that the future celebrates our foresight, not that they are happy. 
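The compounding figure itself checks out; here is a quick sketch using only the numbers in the quote:

```python
from math import log10

annual_return = 0.02   # 2% per year
years = 12_000

log_growth = years * log10(1 + annual_return)   # ~103.3, i.e. more than a googol
log_after_odds = log_growth + log10(1 / 1000)   # ~100.3 after the 1/1000 chance

print(f"growth factor     ~ 10^{log_growth:.1f}")
print(f"after 1/1000 odds ~ 10^{log_after_odds:.1f}")
```

Even after the 1/1000 chance that anyone is around to receive it, the expected multiplier is still about a googol.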

 

In the comments some people gave counterarguments. For those in a rush, the best ones are Toby Ord's. But I didn't buy any of the counterarguments to the extent that would be necessary to counter the 10^100. I have some trouble conceiving of what could beat a consistent argument a googol-fold.

Things that have changed my behavior significantly over the last few years have not been many, but I think I'm facing one of them. Understanding biological immortality was one; it meant 150,000 non-deaths per day. Understanding the posthuman potential was another. Then came the 10^52 potential lives lost in case of X-risk, or, if you are conservative and think only biological substrates can carry morally relevant lives, 10^31. You can argue about which movie you'll watch, which teacher would be best to have, whom you should marry. But (if consequentialist) you can't argue your way out of 10^31 or 10^52. You won't find a counteracting force that exactly matches it, or even one that reduces the value of future stuff by, say,

3 000 000 634 803 867 000 000 000 000 000 000 777 000 000 000 999  fold 

which is still way less than 10^52.

You may find a fundamental and qualitative counterargument ("actually, I'd rather future people didn't exist"), but you won't find a quantitative one. Thus I spend a lot of time on X-risk-related things.

Back to Robin's argument: unless someone gives me a good argument against investing some money in the far future, and provided I can discover some technique, however vague, that gives it at least a one-in-a-million chance of working, I'll set aside a block of money X and a block of time Y, and will invest in people 12,000 years from now. If you don't think you can beat 10^100, join me.

And if you are not in a rush, read this also, for a bright reflection on similar issues. 

 

 
