Wiki Contributions


Omicron Variant Post #1: We’re F***ed, It’s Never Over

Thanks for providing this; it seems extremely important for trying to predict how the pandemic will play out.  It means we should have been much more scared of COVID variants than we would be absent original antigenic sin.  Since it was predictable that COVID, being new to humans, would mutate, those of us (including myself) who had never heard of this force likely underestimated the expected harm of COVID.  Why wasn't this something the official experts were talking about long ago?

The Emperor's New Clothes: a story of motivated stupidity

Me at age 25 (who didn't know he was autistic): "I will say the emperor is naked.  Other people will like me more after I have said the emperor is naked.  That girl who I asked out yesterday, and who said 'I'm busy, maybe some other time,' might now agree to go on a date with me.  I believe other people will like me more because I model other people's thinking on my own, and I would have greater respect for someone else who said that the emperor is naked."

Me at age 54 (who does know he is autistic): "I really, really want to say the emperor is naked.  I get that this will cause most other people to think less of me.  I emotionally believe that I should not care about anyone who would think less of me for saying the emperor is naked, but I intellectually know this isn't true.  I'm also aware that most other people have some natural trepidation about saying the emperor is naked that I, being very weird, have inverted.  This inversion can cause me to fail at social signaling games and hinder progress toward my goals.  But I so very much want to say he is naked that I'm going to do it unless I can convince myself that the costs of doing so are very high.  Being a tenured professor means I probably won't suffer too much by being honest in this case, and I have succeeded in having a few friends who would not abandon me for saying the emperor is naked.  Indeed, one such friend has a blog post up saying that the emperor is not only naked but also mentally defective."

The Emperor's New Clothes: a story of motivated stupidity

The Emperor's New Clothes should be taught to autistic children with IQs above, say, 90, with the lesson that "normal" people sometimes realize that the Emperor is naked, and sometimes come to truly believe he is clothed, but "normal" people almost always get very mad at anyone who correctly points out that the Emperor is naked.  Being autistic can give you the superpower of caring more about truth than social acceptability.  Use your power, but understand its personal cost.

On a personal note, being autistic is likely why I had the "courage" to be one of the three people at my college to speak on the record with a New York Times reporter about political correctness at my workplace.  A discussion with the reporter starts at 5:10 on this podcast, and this is the NYT article.

On Raising Awareness

I was going to suggest you try to reach EA people, but they might want to achieve AGI as quickly as possible, since a friendly AGI would likely quickly improve the world.  While the pool is very small, I have noticed a strong overlap between people worried about unfriendly AGI and people who have signed up for cryonics, or who at least think cryonics is a reasonable choice.  It might be worth doing a survey of computer programmers who have thought about AGI to see which traits correlate with being worried about unaligned AGI.

From a selfish viewpoint, younger people should want AGI development to go slower than older people do since, cryonics aside, the older you are, the more likely you are to die before an AGI has the ability to cure aging.

On Raising Awareness

Yes, although you want to be very careful not to attract people to the field of AGI who don't end up working on alignment but end up shortening the time to when we get super-human AGI.

James_Miller's Shortform

A human-made post-singularity AI would surpass the intellectual capabilities of ETs maybe 30 seconds after it surpassed ours.


My guess is that aliens have either solved the alignment issue and are post-singularity themselves, or will stop us from having a singularity.  I think any civilization capable of building spaceships will have explored AI, but I could just lack the imagination to consider otherwise.

What’s the likelihood of only sub exponential growth for AGI?

Normally this is a good approach, but a problem with the "UFOs are aliens" theory is that there is a massive amount of evidence (much of it undoubtedly crap), the most important of which is likely classified top secret, so you have to put a lot of weight on what other people (especially those with direct access to those who hold top secret security clearances) say they believe.

James_Miller's Shortform

I think that if UFOs are aliens, they on net increase our chance of survival.  I mostly think Eliezer is right about AI risks, and if the aliens are here, they clearly have the capacity to kill us but are not doing so, and they would likely not want us to create a paperclip maximizer.  They might stop us from creating a paperclip maximizer by killing us, but in that scenario we would have died anyway had the aliens not existed.  But it's also possible that the aliens will save us by preventing us from creating a paperclip maximizer.

It's extremely weird that atomic weapons have not been used in anger since WWII, and we know that humanity got lucky on several occasions.  UFOs seem to like to be around ships that carry nuclear weapons and nuclear power, so I assign some non-trivial probability to aliens having saved us from nuclear war.

As to the probability assessment, this is my first attempt, so don't put a lot of weight on it: if there are no aliens, a 75% chance (my guess; I don't know Eliezer's) that we destroy ourselves.  I put the probability that UFOs are aliens at 40%, and, conditional on that, say a 30% chance they would save us from killing ourselves and a 3% chance they would choose to destroy us in a situation in which we wouldn't have done it to ourselves.
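One way to combine these guesses into an overall doom probability (a sketch; my own interpretation is that the 30% figure applies conditional on us otherwise destroying ourselves, and the 3% figure conditional on us otherwise surviving):

```python
# All figures are the stated guesses from the paragraph above, not established facts.
p_aliens = 0.40          # chance UFOs are aliens
p_self_destruct = 0.75   # chance we destroy ourselves absent aliens
p_saved = 0.30           # chance aliens save us, given we would have destroyed ourselves
p_alien_kill = 0.03      # chance aliens destroy us, given we otherwise would not have

# No aliens: doom is entirely our own doing.
p_doom_no_aliens = p_self_destruct

# Aliens present: they may avert our self-destruction, or cause doom themselves.
p_doom_aliens = (p_self_destruct * (1 - p_saved)
                 + (1 - p_self_destruct) * p_alien_kill)

# Marginalize over whether the aliens exist.
p_doom = (1 - p_aliens) * p_doom_no_aliens + p_aliens * p_doom_aliens
print(round(p_doom, 4))  # → 0.663
```

Under this reading, the aliens-exist branch carries roughly a 53% doom probability versus 75% without them, so the blended estimate of about 66% is below the no-aliens baseline, consistent with the claim above that aliens on net increase our chance of survival.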

What’s the likelihood of only sub exponential growth for AGI?

Yes.  I don't know you, so please don't read this as an insult.  But if Sam Altman and Tyler Cowen take an idea seriously, don't you have to as well?  Remember that disagreement is disrespect, so saying that UFOs should not be taken seriously amounts to saying that you have a better reasoning process than either of those two men.

On Raising Awareness

This:  https://www.effectivealtruism.org/articles/introduction-to-effective-altruism/
