SingularityUtopia

Comments

AI Risk and Opportunity: A Strategic Analysis

Yes, I did mention the fox; foxes are not particularly domesticated. Anyway, this "open" discussion is not very open now due to my negative karma. It is too difficult to communicate, which I suppose is the idea of the karma system: to silence ideas you don't want to hear about. Thus I will conform to what you want and leave you to your speculations regarding AI.

AI Risk and Opportunity: A Strategic Analysis

Dear asr, the issue was the worth of emotions in relation to thinking. Here is a better quote:

"Here’s the strange part: although these predictions concerned a vast range of events, the results were consistent across every trial: people who were more likely to trust their feelings were also more likely to accurately predict the outcome. Pham’s catchy name for this phenomenon is the emotional oracle effect."

Mitchell wrote: "These are all emotional statements that do not stand up to reason."

Perhaps reason is not the best tool for being accurate?

PS. LessWrong is too slow: "You are trying to submit too fast. try again in 1 minute." ...and: "You are trying to submit too fast. try again in 7 minutes." LOL "You are trying to submit too fast. try again in 27 seconds."

AI Risk and Opportunity: A Strategic Analysis

Dear JoshuaZ, regarding this:

"Consider the uploaded individual that decides to turn the entire planet into computronium or worse, turn the solar system into a Matrioshka brain. People opt out of that how?"

I consider such a premise so unlikely as to be effectively impossible. It is a very silly premise, for three reasons.

  1. Destroying the entire planet when there is a whole universe full of matter is insane. If insane people exist in the future post-intelligence-explosion upload-world, they will be dealt with, so there is no danger; but insanity will in any case be impossible post-intelligence-explosion, because insanity is a consequence of stupidity, and stupidity will be extinct in the future.

  2. Earth-destructive actions are stupid: see the above explanation regarding insanity, which also explains how stupidity will be obsolete.

  3. People opt out by stating they want to opt out. I'm sure an email will suffice.

"It isn't obvious to me that all wars stem from resource scarcity."

Sorry that it isn't obvious how scarcity causes war. I don't have time to explain so I will leave you with some consensual validation regarding Ray Kurzweil who seems to think the war-scarcity interrelationship is obvious:

"I've actually grown up with a history of scarcity — and wars and conflict come from scarcity — but information is quite the opposite of that." ~ Ray Kurzweil http://www.hollywoodreporter.com/risky-business/sxsw-2012-damon-lindelof-ray-kurzweil-297218

AI Risk and Opportunity: A Strategic Analysis

The only evidence I have is my own perception of the world, based upon my life knowledge, my extensive awareness of living. I am not trying to prove anything. I'm merely throwing my thoughts out there. You can either conclude my thoughts make sense or not. I think it is unintelligent to join the army, but is my opinion correct? Personally I think it is stupid to die. People may agree my survival-based definition of intelligence is correct, or they may think death can be intelligent, such as the deaths of soldiers.

What type of evidence could prove "well-educated" army officers are actually dim-witted fools? Perhaps, via the interconnectedness of causation, it could be demonstrated how military action causes immense suffering for many innocent people, thereby harming everyone, because the world is a more hostile place than a hypothetical world where all potential conflict was resolved intelligently via peaceful methods. The military budget detracts from the science budget, thus perhaps scientific progress is delayed; although I do recognise the military invests in sci-tech development, I think the investment would be greater if our world was not based on conflict. In a world where people don't fight there would be no need for secrecy, thus greater collaboration on scientific endeavours, thus progress could be quicker. Anyone supporting the army could therefore be delaying progress in a small way, thus officers are stupid, because it is stupid to delay progress.

The intelligent thing is for me to draw my input into this debate to a close because it is becoming exceptionally painful for me.

AI Risk and Opportunity: A Strategic Analysis

"It seems that you are using 'intelligent' to mean something like 'would make the same decisions SingularityUtopia would make in that context'."

No, "intelligence" is an issue of survival; it is intelligent to survive. Survival is a key aspect of intelligence. I do want to survive, but the intelligent course of action is not merely what I would do. The sensibleness, the intelligence, of survival is something beyond myself; it is applicable to other beings. But people do disagree regarding the definition of intelligence. Some people think it is intelligent to die.

"Almost no one, regardless of intelligence, opts for cryonics. Moreover, cryonics was first proposed in 1962 by Robert Ettinger, 9 years after Stalin was dead. It is a bit difficult to opt for cryonics when it doesn't exist yet."

An intelligent person would realise that freezing a body could preserve life even if nobody had ever considered the possibility.

Quickly browsing the net I found this:

"In 1940, pioneer biologist Basil Luyet published a work titled "Life and Death at Low Temperatures""

http://www.cryocare.org/index.cgi?subdir=&url=history.txt

1940 was before Stalin's death, but truly intelligent people would not need other thinkers to inspire their thinking. The decay-limiting effect of freezing has long been known. Furthermore, Amazon seems to state Luyet's work "Life and Death at Low Temperatures" was published pre-1923: http://www.amazon.com/Life-death-at-low-temperatures/dp/1178934128

According to Wikipedia many works of fiction dealt with the cryonics issue well before Stalin's death:

"Lydia Maria Child's short story "Hilda Silfverling, A Fantasy" (1886), Jack London's first published work "A Thousand Deaths" (1899), V. Mayakovsky's "Klop" (1928), H.P. Lovecraft's "Cool Air" (1928), and Edgar Rice Burroughs' "The Resurrection of Jimber-Jaw" (1937). Many of the subjects in these stories are unwilling ones, although a 1931 short story by Neil R. Jones called "The Jameson Satellite"..."

http://en.wikipedia.org/wiki/Cryonics#Cryonics_in_popular_culture

AI Risk and Opportunity: A Strategic Analysis

Dear gwern, it all depends on how you define intelligence.

Google Translate knows lots of languages. Google is a great information resource. Watson (the AI) appears to be educated; perhaps Watson could pass many exams. But Google and Watson are not intelligent.

Regarding the few people who are rocket scientists I wonder if the truly rare geniuses, the truly intelligent people, are less likely to be violent?

"Few people are. Officers can be quite intelligent and well-educated people. The military academies are some of the best educational institutions around, with selection standards more comparable to Harvard than community college. In one of my own communities, Haskell programmers, the top purely functional data structure guys, Okasaki, is a West Point instructor."

Officers in the army are actually very dim despite being "well-educated".

I wasn't trying to troll you regarding the term "Grunt"; I was merely spelling out clearly the meaning behind the term. "Grunt" is an insult to the intelligence of the soldier, perhaps made because someone who thinks it is intelligent to join the army (being violent) is a dumb human only capable of grunting.

Maybe it is intelligent to be cannon fodder, but like I say it all depends on how you define intelligence. http://en.wikipedia.org/wiki/Cannon_fodder

AI Risk and Opportunity: A Strategic Analysis

http://www.wired.com/wiredscience/2012/03/are-emotions-prophetic/

"If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be more intelligent, at least in some conditions."

AI Risk and Opportunity: A Strategic Analysis

I am not presenting a scientific thesis. This is only a debate, and a reasonably informal one at that. I am thinking openly. I am asking specific questions likely to elicit specific responses. I am speculating.

asr, you wrote:

"The word we usually use for intelligent violence is "ruthless" or "cunning" -- and many people are described that way. Stalin, for instance, was apparently capable of long hours of hard work, had an excellent attention to detail, and otherwise appears to have been a smart guy. Just also willing to have millions of people murdered."

My point regarding mindless violence versus ruthlessness or cunning is that "ruthless" and "cunning" do not tie intelligence to violence in the blatant way the phrase "mindless violence" does. Saddam and Gaddafi were cunning in a similar way to Stalin, but the deaths of Saddam and Gaddafi indicate their cunning was not intelligent; in fact it is very stupid to die so close to Singularitarian immortality.

I am not asserting this proves all violence is mindless thus violence decreases with greater intelligence. I am simply offering food for thought. It is not a scientific thesis I am presenting. I am merely throwing some ideas out there to see how people respond.

If Stalin was truly intelligent then I assume he opted for Cryonic preservation?

"...Stalin was injected with poison by the guard Khrustalev, under the orders of his master, KGB chief Lavrenty Beria. And what was the reason Stalin was killed?"

http://news.bbc.co.uk/1/hi/world/europe/2793501.stm

Regarding stupidity and the armed forces I have addressed this elsewhere: http://lesswrong.com/lw/ajm/ai_risk_and_opportunity_a_strategic_analysis/5zgl
