2016 LessWrong Diaspora Survey Analysis: Part Four (Politics, Calibration & Probability, Futurology, Charity & Effective Altruism)
The LessWrong survey has a very involved section dedicated to politics. In previous analyses, the benefits of this weren't fully realized. In the 2016 analysis we can look at not just the political affiliation of a respondent, but which beliefs are associated with a given affiliation. The charts below summarize most of the results.
Political Opinions By Political Affiliation
There were also some other questions in this section which aren't covered by the above charts.
Calibration And Probability Questions
I just couldn't analyze these, sorry guys. I put many hours into trying to get them into a decent format I could even read, and that sucked up an incredible amount of time; it's why this part of the survey took so long to get out. Thankfully another LessWrong user, Houshalter, has kindly done their own analysis.
All my calibration questions were meant to satisfy a few essential properties:
- They should be 'self contained', i.e., something you can reasonably answer, or at least try to answer, with a 5th grade science education and normal life experience.
- They should, at least to a certain extent, be Fermi Estimable.
- They should progressively scale in difficulty so you can see whether somebody understands basic probability or not. (E.g., in an 'or' question, do they put a probability of less than 50% on being right?)
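That last property lends itself to an automated check. A minimal sketch (the function name and numbers are mine, not part of the survey pipeline): a respondent's probability for an 'or' question should be at least the maximum of what they'd assign either disjunct alone, and at most the sum of the two.

```python
# Coherence check for disjunctive ("or") probability questions.
# The function name and example numbers are illustrative, not from the survey.

def coherent_or(p_a: float, p_b: float, p_a_or_b: float) -> bool:
    """P(A or B) must lie between max(P(A), P(B)) and min(1, P(A) + P(B))."""
    return max(p_a, p_b) <= p_a_or_b <= min(1.0, p_a + p_b)

print(coherent_or(0.3, 0.4, 0.6))  # True: consistent with probability theory
print(coherent_or(0.3, 0.4, 0.2))  # False: the 'or' is below both disjuncts
```

A respondent who fails checks like this one probably doesn't understand basic probability, which is what the difficulty scaling is meant to reveal.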
At least one person requested a workbook, so I might write more in the future. I'll obviously write more for the survey.
| Question | Mean | Median | Mode | Stdev |
|---|---|---|---|---|
| Please give the obvious answer to this question, so I can automatically throw away all surveys that don't follow the rules: What is the probability of a fair coin coming up heads? | 49.821 | 50.0 | 50.0 | 3.033 |
| What is the probability that the Many Worlds interpretation of quantum mechanics is more or less correct? | 44.599 | 50.0 | 50.0 | 29.193 |
| What is the probability that non-human, non-Earthly intelligent life exists in the observable universe? | 75.727 | 90.0 | 99.0 | 31.893 |
| ...in the Milky Way galaxy? | 45.966 | 50.0 | 10.0 | 38.395 |
| What is the probability that supernatural events (including God, ghosts, magic, etc) have occurred since the beginning of the universe? | 13.575 | 1.0 | 1.0 | 27.576 |
| What is the probability that there is a god, defined as a supernatural intelligent entity who created the universe? | 15.474 | 1.0 | 1.0 | 27.891 |
| What is the probability that any of humankind's revealed religions is more or less correct? | 10.624 | 0.5 | 1.0 | 26.257 |
| What is the probability that an average person cryonically frozen today will be successfully restored to life at some future time, conditional on no global catastrophe destroying civilization before then? | 21.225 | 10.0 | 5.0 | 26.782 |
| What is the probability that at least one person living at this moment will reach an age of one thousand years, conditional on no global catastrophe destroying civilization in that time? | 25.263 | 10.0 | 1.0 | 30.510 |
| What is the probability that our universe is a simulation? | 25.256 | 10.0 | 50.0 | 28.404 |
| What is the probability that significant global warming is occurring or will soon occur, and is primarily caused by human actions? | 83.307 | 90.0 | 90.0 | 23.167 |
| What is the probability that the human race will make it to 2100 without any catastrophe that wipes out more than 90% of humanity? | 76.310 | 80.0 | 80.0 | 22.933 |
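For reference, summary statistics like those above can be computed directly with Python's standard library; the response list below is invented, not actual survey answers.

```python
# Computing mean, median, mode, and standard deviation from a list of raw
# probability answers. The data here is made up for illustration.
import statistics

responses = [50.0, 50.0, 50.0, 49.0, 51.0, 50.0, 45.0, 55.0]

print(statistics.mean(responses))    # 50.0
print(statistics.median(responses))  # 50.0
print(statistics.mode(responses))    # 50.0
print(statistics.stdev(responses))   # ~2.73 (sample standard deviation)
```

Note that `statistics.stdev` computes the sample (n-1) standard deviation, which is the usual choice for survey data.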
The probability questions are probably the area of the survey I put the least effort into. My plan for next year is to overhaul these sections entirely and try including some Tetlock-esque forecasting questions, a link to some advice on how to make good predictions, etc.
This section got a bit of a facelift this year, with new questions on cryonics, genetic engineering, and technological unemployment in addition to the previous years'.
Interestingly enough, of those who are confident enough that cryonics will work to say 'yes', only 14 are actually signed up.
sqlite> select count(*) from data where CryonicsNow="Yes" and Cryonics="Yes - signed up or just finishing up paperwork";
14
sqlite> select count(*) from data where CryonicsNow="Yes" and (Cryonics="Yes - signed up or just finishing up paperwork" OR Cryonics="No - would like to sign up but unavailable in my area" OR Cryonics="No - would like to sign up but haven't gotten around to it" OR Cryonics="No - would like to sign up but can't afford it");
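For reference, the same cross-tabulation can be written with an `IN` clause, which avoids repeating `Cryonics=` in every OR branch. Here's a sketch using Python's built-in sqlite3 module; the in-memory database and its two rows are invented stand-ins for the real survey data, with column names mirroring the sqlite session above.

```python
# Counting respondents who believe cryonics will work and would sign up.
# The two rows inserted below are toy data, not actual survey responses.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table data (CryonicsNow text, Cryonics text)")
conn.executemany(
    "insert into data values (?, ?)",
    [
        ("Yes", "Yes - signed up or just finishing up paperwork"),
        ("Yes", "No - never considered it"),
    ],
)

would_sign_up = (
    "Yes - signed up or just finishing up paperwork",
    "No - would like to sign up but unavailable in my area",
    "No - would like to sign up but haven't gotten around to it",
    "No - would like to sign up but can't afford it",
)
count, = conn.execute(
    "select count(*) from data where CryonicsNow='Yes' and Cryonics in (?,?,?,?)",
    would_sign_up,
).fetchone()
print(count)  # 1 for this toy data
```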
LessWrongers seem to be very bullish on the underlying physics of cryonics even if they're not as enthusiastic about current methods in use.
The Brain Preservation Foundation also did an analysis of cryonics responses to the LessWrong Survey.
By what year do you think the Singularity will occur? Answer such that you think, conditional on the Singularity occurring, there is an even chance of the Singularity falling before or after this year. If you think a singularity is so unlikely you don't even want to condition on it, leave this question blank.
Stdev: 2.847858859055733e+18

I didn't bother to filter out the silly answers for this.
Obviously it's a bit hard to see without filtering out the uber-large answers, but the median doesn't seem to have changed much from the 2014 survey.
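This is exactly why the median is the statistic to watch here: a handful of absurd answers can inflate the standard deviation arbitrarily while barely moving the median. A sketch with invented answers and an arbitrary year-3000 cutoff:

```python
# Median vs. standard deviation under absurd outliers. The answers and
# the cutoff year are arbitrary illustrations, not the survey's data.
import statistics

answers = [2040, 2045, 2060, 2100, 2150, 3000, 10**18]
filtered = [year for year in answers if year <= 3000]

print(statistics.median(answers))   # 2100: barely affected by the outlier
print(statistics.stdev(answers))    # astronomically large
print(statistics.median(filtered))  # 2080.0
```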
Well that's fairly overwhelming.
I find it amusing how the strict "No" group shrinks considerably after this question.
This question is too important to just not have an answer to, so I'll do it manually. Unfortunately I can't easily remove the 'excluded' entries so that we're dealing with the exact same distribution, but only 13 or so responses are filtered out anyway.
sqlite> select count(*) from data where GeneticImprovement="Yes";
>>> 1100 + 176 + 262 + 84
1622
>>> 1100 / 1622
0.6781750924784217
67.8% are willing to genetically engineer their children for improvements.
These numbers go about how you would expect, with people being progressively less interested the more 'shallow' a genetic change is seen as.
All three of these seem largely consistent with people's personal preferences about modification. Were I so inclined, I could do a deeper analysis that takes survey respondents row by row and looks at the correlation between preferences for one's own children and preferences for others'.
Do you think the Luddite's Fallacy is an actual fallacy?
Yes: 443 (30.936%)
No: 989 (69.064%)
We can use this as an overall measure of worry about technological unemployment, which would seem to be high among the LW demographic.
By what year do you think the majority of people in your country will have trouble finding employment for automation related reasons? If you think this is something that will never happen leave this question blank.
Stdev: 1180.2342850727339

The question is flawed because you can't distinguish answers of "never happen" from people who just didn't see the question.
Interesting question that would be fun to take a look at in comparison to the estimates for the singularity.
Do you think the "end of work" would be a good thing?
Yes: 1238 (81.287%)
No: 285 (18.713%)
Fairly overwhelming consensus, but with a significant minority of people who have a dissenting opinion.
If machines end all or almost all employment, what are your biggest worries? Pick two.
| Worry | Count | % |
|---|---|---|
| People will just idle about in destructive ways | 513 | 16.71% |
| People need work to be fulfilled and if we eliminate work we'll all feel deep existential angst | 543 | 17.687% |
| The rich are going to take all the resources for themselves and leave the rest of us to starve or live in poverty | 1066 | 34.723% |
| The machines won't need us, and we'll starve to death or be otherwise liquidated | 416 | 13.55% |
The plurality of worries are about elites who refuse to share their wealth.
Which disaster do you think is most likely to wipe out greater than 90% of humanity before the year 2100?
Nuclear war: +4.800% 326 (20.6%)
Asteroid strike: -0.200% 64 (4.1%)
Unfriendly AI: +1.000% 271 (17.2%)
Nanotech / grey goo: -2.000% 18 (1.1%)
Pandemic (natural): +0.100% 120 (7.6%)
Pandemic (bioengineered): +1.900% 355 (22.5%)
Environmental collapse (including global warming): +1.500% 252 (16.0%)
Economic / political collapse: -1.400% 136 (8.6%)
Other: 35 (2.217%)
Significantly more people are worried about nuclear war than last year. Is this an effect of new respondents, or of the geopolitical situation? Who knows.
Charity And Effective Altruism
What is your approximate annual income in US dollars (non-Americans: convert at www.xe.com)? Obviously you don't need to answer this question if you don't want to. Please don't include commas or dollar signs.
How much money, in number of dollars, have you donated to charity over the past year? (non-Americans: convert to dollars at http://www.xe.com/ ). Please don't include commas or dollar signs in your answer. For example, 4000
How much money have you donated to charities aiming to reduce existential risk (other than MIRI/CFAR) in the past year?
How much have you donated in US dollars to the following charities in the past year? (Non-americans: convert to dollars at http://www.xe.com/) Please don't include commas or dollar signs in your answer. Options starting with "any" aren't the name of a charity but a category of charity.
| Charity | Sum | Mean | Median | Mode | Stdev |
|---|---|---|---|---|---|
| Against Malaria Foundation | 483935.027 | 1905.256 | 300.0 | None | 7216.020 |
| Schistosomiasis Control Initiative | 47908.0 | 840.491 | 200.0 | 1000.0 | 1618.785 |
| Deworm the World Initiative | 28820.0 | 565.098 | 150.0 | 500.0 | 1432.712 |
| Any kind of animal rights charity | 83130.47 | 1093.821 | 154.235 | 500.0 | 2313.493 |
| Any kind of bug rights charity | 1083.0 | 270.75 | 157.5 | None | 353.396 |
| Machine Intelligence Research Institute | 141792.5 | 1417.925 | 100.0 | 100.0 | 5370.485 |
| Any charity combating nuclear existential risk | 491.0 | 81.833 | 75.0 | 100.0 | 68.060 |
| Any charity combating global warming | 13012.0 | 245.509 | 100.0 | 10.0 | 365.542 |
| Center For Applied Rationality | 127101.0 | 3177.525 | 150.0 | 100.0 | 12969.096 |
| Strategies for Engineered Negligible Senescence Research Foundation | 9429.0 | 554.647 | 100.0 | 20.0 | 1156.431 |
| Any campaign for political office | 38443.99 | 366.133 | 50.0 | 50.0 | 1374.305 |
This table is interesting given the recent debates about how much money certain causes are 'taking up' in Effective Altruism.
Do you follow any dietary restrictions related to animal products?
Yes, I am vegan: 54 (3.4%)
Yes, I am vegetarian: 158 (10.0%)
Yes, I restrict meat some other way (pescetarian, flexitarian, try to only eat ethically sourced meat): 375 (23.7%)
No: 996 (62.9%)
Do you know what Effective Altruism is?
Yes: 1562 (89.3%)
No but I've heard of it: 114 (6.5%)
No: 74 (4.2%)
Do you self-identify as an Effective Altruist?
Yes: 665 (39.233%)
No: 1030 (60.767%)
The distribution given by the 2014 survey results does not sum to one, so it's difficult to determine whether Effective Altruism's membership actually went up, but if we take the numbers at face value it experienced an 11.13% increase in membership.
Do you participate in the Effective Altruism community?
Yes: 314 (18.427%)
No: 1390 (81.573%)
Same issue as last time: taking the numbers at face value, community participation went up by 5.727%.
Has Effective Altruism caused you to make donations you otherwise wouldn't?
Yes: 666 (39.269%)
No: 1030 (60.731%)
Effective Altruist Anxiety
Have you ever had any kind of moral anxiety over Effective Altruism?
Yes: 501 (29.6%)
Yes but only because I worry about everything: 184 (10.9%)
No: 1008 (59.5%)
There's an ongoing debate in Effective Altruism about what kind of rhetorical strategy is best for getting people on board and whether Effective Altruism is causing people significant moral anxiety.
It certainly appears to be. But is moral anxiety effective? Let's look:
Sample Size: 244
Average amount of money donated by people anxious about EA who aren't EAs: 257.5409836065574
Sample Size: 679
Average amount of money donated by people who aren't anxious about EA who aren't EAs: 479.7501384388807
Sample Size: 249
Average amount of money donated by EAs anxious about EA: 1841.5292369477913
Sample Size: 314
Average amount of money donated by EAs not anxious about EA: 1837.8248407643312
It seems fairly conclusive that anxiety is not a good way to get people to donate more than they already are, but is it a good way to get people to become Effective Altruists?
Sample Size: 1685
P(Effective Altruist): 0.3940652818991098
P(EA Anxiety): 0.29554896142433235
P(Effective Altruist | EA Anxiety): 0.5
Maybe. There is of course an argument to be made that sufficient good done by causing people anxiety outweighs feeding into people's scrupulosity, but it can be discussed after I get through explaining it on the phone to wealthy PR-conscious donors and telling the local all-kill shelter where I want my shipment of dead kittens.
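For clarity, here's how figures like the three probabilities above fall out of per-respondent rows; the five records below are a made-up miniature, not the actual survey data.

```python
# Computing P(EA), P(EA anxiety), and P(EA | EA anxiety) from rows of
# (is_EA, has_anxiety) flags. The records are invented for illustration.
rows = [
    {"ea": True,  "anxiety": True},
    {"ea": True,  "anxiety": False},
    {"ea": False, "anxiety": True},
    {"ea": False, "anxiety": False},
    {"ea": False, "anxiety": False},
]

n = len(rows)
p_ea = sum(r["ea"] for r in rows) / n
p_anxiety = sum(r["anxiety"] for r in rows) / n

# Condition on anxiety: restrict to anxious respondents, then count EAs.
anxious = [r for r in rows if r["anxiety"]]
p_ea_given_anxiety = sum(r["ea"] for r in anxious) / len(anxious)

print(p_ea)                # 0.4
print(p_anxiety)           # 0.4
print(p_ea_given_anxiety)  # 0.5
```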
What's your overall opinion of Effective Altruism?
Positive: 809 (47.6%)
Mostly Positive: 535 (31.5%)
No strong opinion: 258 (15.2%)
Mostly Negative: 75 (4.4%)
Negative: 24 (1.4%)
EA appears to be doing a pretty good job of getting people to like it.
|Affiliation||Income||Charity Contributions||% Income Donated To Charity||Total Survey Charity %||Sample Size|
| Community | Count | % In Community | Sample Size |
|---|---|---|---|
| LessWrong Facebook Group | 83 | 48.256% | 172 |
| Effective Altruism Hub | 86 | 86.869% | 99 |
| Good Judgement(TM) Open | 23 | 74.194% | 31 |
| #lesswrong on freenode | 19 | 24.675% | 77 |
| #slatestarcodex on freenode | 9 | 24.324% | 37 |
| #chapelperilous on freenode | 2 | 18.182% | 11 |
| One or more private 'rationalist' groups | 91 | 47.15% | 193 |
|Affiliation||EA Income||EA Charity||Sample Size|