Filipe's Comments

Open thread, 11-17 August 2014

Economist Scott Sumner, at Econlog, heavily praised Yudkowsky and the quantum physics sequence, and applied lessons from it to economics. Excerpts:

I've recently been working my way through a long set of 2008 blog posts by Eliezer Yudkowsky. It starts with an attempt to make quantum mechanics seem "normal," and then branches out into some interesting essays on philosophy and science. I'm nowhere near as smart as Yudkowsky, so I can't offer any opinion on the science he discusses, but when the posts touched on epistemological issues his views hit home.

and

I used to have a prejudice against math/physics geniuses. I thought when they were brilliant at high-level math and theory, they were likely to have loony opinions on complex social science issues. Conspiracy theories. Or policy views that the government should wave a magic wand and just ban everything bad. Now that I've read Robin Hanson, Eliezer Yudkowsky and David Deutsch, I realize that I've got it wrong. A substantial number of these geniuses have thought much more deeply about epistemological issues than the average economist. So when Hanson says we put far too little effort into existential risks, or even lesser but still massive threats like solar flares, and Yudkowsky says cryonics is under-appreciated, or when they say AI (or brain ems) is coming faster than we think and will have far more profound effects than we realize, I'm inclined to take them very seriously.

Is my view contrarian?

Even though he calls it "The Smart Vote", the concept is a way to figure out the truth, not to challenge current democratic notions (I think), and is quite a bit more sophisticated than merely giving greater weight to smarter people's opinions.

Is my view contrarian?

Garth Zietsman, who by his own account "scored an IQ of 185 on the Mega27 and has a degree in psychology and statistics and 25 years' experience in psychometrics and statistics", proposed the statistical concept of The Smart Vote, which seems to resemble your "Mildly extrapolate elite opinion". There are many applications of his idea to relevant topics on his blog.

It's not choosing the most popular answer among the smart people in any (aggregation of) poll(s). Instead, for each answer, you compare the proportion of the most intelligent respondents choosing it against the proportion of the less intelligent choosing it, and decide The Smart Vote is the answer with the largest ratio, after controlling for possible interests.
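The ratio comparison described above can be sketched in a few lines. This is a minimal illustration with hypothetical poll counts, not Zietsman's actual procedure (which also controls for respondents' interests); the function name and data are made up for the example.

```python
def smart_vote(poll):
    """Pick the answer with the largest ratio of its share among
    high scorers to its share among low scorers.
    `poll` maps each answer to (high_scorer_count, low_scorer_count)."""
    high_total = sum(h for h, _ in poll.values())
    low_total = sum(l for _, l in poll.values())

    def ratio(counts):
        h, l = counts
        # An answer no low scorer picks gets an infinite ratio.
        return (h / high_total) / (l / low_total) if l else float("inf")

    return max(poll, key=lambda answer: ratio(poll[answer]))

# Hypothetical poll: answer -> (count among high scorers, count among low scorers)
poll = {"A": (60, 40), "B": (30, 50), "C": (10, 10)}
print(smart_vote(poll))  # "A": its share is 60% among high scorers vs 40% among low scorers
```

Note that answer "A" wins not because it is most popular overall, but because its support is most skewed toward the high-scoring group, which is the point of the method.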

Huffington Post article on DeepMind-requested AI ethics board, links back to LW [link]

A blog connected to the NYT also linked to the interview.

Mr. Legg noted in a 2011 Q&A with the LessWrong blog that technology and artificial intelligence could have negative consequences for humanity.

Gauging interest for a Rio de Janeiro meetup group.

I'm from Rio. You may PM me if there's enough interest.

How to deal with someone in a LessWrong meeting being creepy

Is there an actual history of people complaining about 'creepy behavior' at LW meetups? Or is this just one of those blank-slatey attempts to explain the gender ratio in high-IQ communities by some form of discrimination, without any evidence?

[This comment is no longer endorsed by its author]
How to Find a Personal Assistant to Produce More? Transhumanism

I'm sure it is correlated. One might even find correlations with other things, such as race and gender... I questioned the fairness of using it as a way to recruit people.
