AFAIK, the main obstacles are
Solutions to these problems, like space elevators to get rid of the rocket equation, new medical fields built from scratch, artificial wombs, energy production, and closed biosphere management, count, in my book, as discontinuous tech progress if you want them in the short term (within a century). This also looks like a 'big bang' project-management approach, in which we have very little time to gather feedback on unforeseen problems, feedback which an incremental, long-term approach would provide. Big-bang projects carry a very high risk of failure and wasted resources (and we are talking about a lot of resources here).
An incremental, long-term approach would require sustained, heavy effort over the course of several centuries, which, again in my book, does not count as the foreseeable future.
Thanks to both of you for your pointers.
Is anyone aware of any article discussing scalability issues?
I agree that from an individual standpoint it is rational to sign up for cryonics, but is it really a good idea for mankind in general to sign up for cryonics en masse? Would it not create an awful drag on the economy that would delay, or maybe even prevent, mankind from acquiring the technology necessary for reviving the "dead"?
From what I read of the business model of Alcor and CI, the costs of sustaining cryopreservation are paid out of the dividends/interest on a small capital fund constituted through life insurance. If more and more people enter cryopreservation, more and more of that capital will need to find investment opportunities. Is it economically feasible to provide capital income for an ever-growing amount of capital?
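To make the question concrete, here is a back-of-the-envelope perpetuity model. All the numbers are hypothetical assumptions for illustration (they are not Alcor or CI figures): $1,000/year of maintenance per patient and a 3% real return on the trust capital.

```python
# Toy perpetuity model for cryonics trust funding.
# All figures below are hypothetical assumptions, not provider data.

def required_principal(annual_cost: float, real_return: float) -> float:
    """Capital whose real return covers the annual maintenance cost forever."""
    return annual_cost / real_return

# Assumed: $1,000/year maintenance per patient, 3% real return.
per_patient = required_principal(1_000, 0.03)

# Aggregate trust capital as the preserved population grows:
for patients in (1_000, 10_000, 100_000, 1_000_000):
    total = patients * per_patient
    print(f"{patients:>9,} patients -> ${total:,.0f} held in trust")
```

Note that the per-patient principal stays fixed; what grows is the aggregate pool that must find productive investments, which is exactly the open question. A toy model like this cannot say whether markets can absorb that capital, only how fast it accumulates.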
Has someone really considered a society where the number of the "dead" equals the number of the living? Has someone considered a society where the number of the "dead" equals ten times the number of the living?
I saw "economies of scale" mentioned a number of times. Have those economies of scale been quantified? Are they real, or just a magic phrase? What if we do not manage to grow the energy supply fast enough to ensure cryostasis for everyone?
Have the ethical implications been considered of the risk that mass cryopreservation poses to future generations' economic and technological growth? What if what I pay for my cryopreservation would be better used in medical or AI research? What if that money could be used to accelerate the coming of "immortality" for my grandchildren? What if my cryopreservation slows down or prevents the coming of "immortality" for everyone (including myself, but also my grandchildren)?
Same here. This does not strike me as a good argument at all... We can reverse it to argue against signing up for cryonics:
"Even if I sign up for cryonics, there will still be some other worlds in which I didn't and in which "I" am dying of cancer."
"Even if I don't sign up, there are still other worlds in which I did."
Maybe there is something about my actually making the choice to sign up in this world that alters/constrains the overall probability distribution, making some outcomes less and less probable in the overall distribution...
I am new to this site and I still have to search through it more thoroughly, but I really can't let that argument fly by without reaction. I apologize in advance if I am making some really dumb mistake here.
Okay, I thought this over a little and I can see a point: the earlier I sign up, the more future "me"s will end up cryopreserved. I do not see how much it matters in the grand scheme of things (I am just choosing a branch; I am not destroying the branch in which I choose not to sign up), but I guess there can be something along the lines of "I cannot do much about the past, but my decisions can influence the 'future'", or "my responsibility is towards my future 'me's; I should not worry about the worlds I cannot 'reach'".
The argument still sounds rather weak to me (and the many-worlds view a bit nihilistic; not that this makes it wrong, but I find it rather odd that you manage to get some sort of positive drive from it).