I'm skeptical about trying to build FAI, but not about trying to influence the Singularity in a positive direction. Some people may be skeptical even of the latter because they don't think an intelligence explosion is very likely. I suggest that even if an intelligence explosion turns out to be impossible, we can still reach a positive Singularity by building what I'll call "modest superintelligences": superintelligent entities capable of taking over the universe and preventing existential risks and Malthusian outcomes, whose construction does not require fast recursive self-improvement or other questionable assumptions about the nature of intelligence. This helps to establish a lower bound on the benefits of an organization that aims to strategically influence the outcome of the Singularity.
- MSI-1: 10^5 biologically cloned humans of von Neumann-level intelligence, highly educated and indoctrinated from birth to work collaboratively towards some goal, such as building MSI-2 (or equivalent)
- MSI-2: 10^10 whole brain emulations of von Neumann, each running at ten times human speed, with WBE-enabled institutional controls that increase group coherence/rationality (or equivalent)
- MSI-3: 10^20 copies of von Neumann WBE, each running at a thousand times human speed, with more advanced (to be invented) institutional controls and collaboration tools (or equivalent)
(To recall what the actual von Neumann, whom we might call MSI-0, accomplished, open his Wikipedia page and scroll through the "known for" sidebar.)
Building an MSI-1 seems to require a total cost on the order of $100 billion (assuming $1 million for each clone), which is comparable to the Apollo project, and about 0.25% of the annual Gross World Product. (For further comparison, note that Apple has a market capitalization of $561 billion, and annual profit of $25 billion.) In exchange for that cost, any nation that undertakes the project has a reasonable chance of obtaining an insurmountable lead in whatever technologies end up driving the Singularity, and with that a large measure of control over its outcome. If no better strategic options come along, lobbying a government to build MSI-1 and/or influencing its design and aims seems to be the least that a Singularitarian organization could do.
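A quick back-of-envelope check on the budget: with 10^5 clones and a $100 billion total, the implied per-clone figure falls out directly.

```python
# Back-of-envelope for the MSI-1 budget. Figures are the ones cited above;
# "per_clone" is the implied cost of raising and educating one clone.
clones = 10**5
total_cost = 100e9                     # $100 billion total
per_clone = total_cost / clones
print(f"implied per-clone cost: ${per_clone:,.0f}")  # $1,000,000
```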
Doing this very reliably seems more fantastical than the intelligence enhancement part.
Where do you get your numbers from? Why aren't [big number] of educated people a superintelligence now? If it's due to coordination problems, then you are sweeping the complexity of solving such problems under the rug.
Not only are there more people today than in von Neumann's time, but it is far easier to be discovered or to educate yourself. The general prosperity level of the world is also far higher. As a result, I expect, purely on statistical grounds, that there would be far more von Neumann level people today than in von Neumann's time. I certainly don't see a shortage of brilliant people in academia, for instance.
What is a test for a von Neumann level intelligence? Do you think "top people" in technical fields today would fail?
What is the current bottleneck on MSI-1? Are we better off raiding von Neumann's corpse, extracting the DNA and then implanting all the embryos we can make? Or are we better off with the current strategies of sequencing intelligent people to uncover the genetics of intelligence, which would then allow embryo selection or engineering? With the latter, the main bottleneck seems to be the cost of sequencing (since one needs a lot of genomes to discern the signal through all the noise), but that cost is being pushed down by the free market at a breathtaking pace - and indeed, the Beijing Genomics Institute (see Hsu, IIRC) is already working hard on the task of sequencing smart kids.
We can't clone humans at the moment. Even attempts to derive human stem cell lines from cloning have been disappointing, and reproductive cloning would face much higher barriers. Even if it could be made to work Dolly-style, you would still be producing huge numbers of miscarriages, early deaths, and damaged offspring for each success. That would not only increase the economic cost, but be incredibly unattractive for parents and a PR nightmare.
We can do embryo selection, but the relevant alleles would need to be identified in large studies (with the effectiveness of selection scaling with the portion of variation explained). The BGI study may expose a number of candidates, but I would expect the majority to be captured by linking genetic data collected for other reasons (or as part of comprehensive biobanks) to military or education...
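The comment's point that selection effectiveness scales with variance explained can be illustrated with a Monte Carlo sketch. The embryo count, the 10%-of-variance predictor, and the 15-point IQ standard deviation are illustrative assumptions of this sketch, not figures from the thread.

```python
import random
import statistics

# Monte Carlo sketch of one round of embryo selection: each embryo gets a
# polygenic-score draw, we keep the highest-scoring embryo, and we ask how
# much expected phenotype that buys. All parameters are illustrative.
random.seed(0)
n_embryos = 10          # embryos available per round (assumption)
var_explained = 0.10    # fraction of phenotypic variance the predictor captures (assumption)
iq_sd = 15              # IQ points per phenotypic standard deviation

trials = 200_000
gains = []
for _ in range(trials):
    # Predicted genetic scores in phenotypic-SD units: N(0, var_explained).
    scores = [random.gauss(0.0, var_explained ** 0.5) for _ in range(n_embryos)]
    # The unpredicted components have mean zero, so the expected phenotypic
    # gain from picking the top embryo equals its predicted score.
    gains.append(max(scores))

mean_gain_sd = statistics.fmean(gains)
print(f"expected gain: {mean_gain_sd:.2f} SD "
      f"(about {mean_gain_sd * iq_sd:.1f} IQ points)")
```

Since only the predictor's standard deviation matters for ranking embryos, the gain scales with the square root of the variance explained, which is the scaling relationship the comment gestures at.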
If intelligence is 50% genetic, and von Neumann was 1 in a billion, the clones will be 1 in 500. Regression to the mean.
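The commenter's figure can be checked with a short sketch; the simple additive model in which a clone's expected z-score is half the donor's is this sketch's assumption, not something spelled out in the comment.

```python
from statistics import NormalDist

nd = NormalDist()

# 1-in-a-billion rarity corresponds to a z-score of about 6.
donor_z = nd.inv_cdf(1 - 1e-9)

# With 50% of the variance genetic (and shared with the clone), the
# clone's expected z-score regresses halfway back to the mean.
clone_z = 0.5 * donor_z

# Tail probability of a random person reaching the clone's expected level.
p = 1 - nd.cdf(clone_z)
print(f"donor z = {donor_z:.2f}, clone expected z = {clone_z:.2f}, "
      f"rarity = 1 in {1 / p:,.0f}")
```

Under these assumptions the answer lands near 1 in 700, the same order of magnitude as the comment's 1 in 500.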
Can you expand your reasons?
Here are some posts/threads where I talk about my reasons: 1 2 3.
While the benefits are clear, it is not so clear that the project would in fact outrun the pace of progress as usual.
Cloning: It is unclear to what extent truly exceptional ability is a result of just being lucky that the random parts of the development process resulted in the right kind of circuitry. I'm not even talking of nature vs nurture. Those clones won't have the same fingerprints, won't have the same minor blood vessel patterns, etc. etc., even if the wombs were exactly identical, as long as the thermal noise differs. See also: http://www.ncbi.nlm.nih.gov/...
Do we have reason to believe the average research engineer of the period couldn't do what von Neumann did given the same materials and information?
How does more computational power help one become more rational? Will it not simply increase the number of irrational decisions made within the group?
If we did reach MSI-3, then the second conjunct of this statement would become redundant.