RobertWiblin

Comments
A LessWrong Crypto Autopsy
RobertWiblin · 7y · 10
1: Our epistemic rationality has probably gotten way ahead of our instrumental rationality

I would defend the instrumental rationality of having a rule of thumb that unless you're quite wealthy, you don't bother looking into anything that appears to be a 'get rich quick' scheme, or seek to invest in high-risk high-return projects you can't evaluate.

Yes, sometimes it will fail big - if you miss the boat on bitcoin, or Facebook, or whatever. Every strategy fails in some scenarios. Sometimes betting it all on 23 red will have been the right call.

But because it i) lowers risk, ii) saves you wasting time looking into lots of dud investments to find the occasional good one, and iii) makes you less of a mark for scams and delusions, I think it's a sensible rule for most people.

A LessWrong Crypto Autopsy
RobertWiblin · 7y · 90

From a selfish point of view, I don't think most rationalists would benefit significantly from a bit of extra money, so it doesn't make much sense for them to dedicate their truly precious resources (time and attention) to identifying high-risk high-return investments like bitcoin and, in this case, figuring out how to buy/store them safely. And I'm someone who bought bitcoin for the sake of entertainment.

From an altruistic point of view, yes, I expect hundreds of millions of dollars to be donated, and the current flow is consistent with that - I know of 5 million in the last few months, and there's probably more that hasn't been declared.

"then it's no longer so plausible that "hundreds of millions is a substantial fraction as good as billions"."

At the full community level the marginal returns on further donations also decline, though more slowly: https://80000hours.org/2017/11/talent-gaps-survey-2017/#how-diminishing-are-returns-in-the-community

A LessWrong Crypto Autopsy
RobertWiblin · 7y · 190

Collectively the community has made hundreds of millions from crypto. But it did so by getting a few wealthy people to buy many bitcoin, rather than many people to buy a few bitcoin. This is a more efficient model because it avoids big fixed costs for each individual.

It also avoids everyone in the community having to dedicate some of their attention to thinking about what outstanding investment opportunities might be available today.

Due to declining marginal returns, hundreds of millions is a substantial fraction as good as billions. So I think we did alright.
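To make the diminishing-returns arithmetic concrete, here is a hypothetical sketch (mine, not from the original comment) that uses logarithmic utility as a stand-in for diminishing marginal returns to community funds - the utility function and dollar figures are illustrative assumptions, not 80,000 Hours numbers:

```python
import math

# Illustrative assumption: value the community's funds with a
# logarithmic utility function, a standard simple model of
# diminishing marginal returns.
def log_value(dollars: float) -> float:
    return math.log(dollars)

# Compare hundreds of millions against billions.
ratio = log_value(300e6) / log_value(3e9)
print(f"{ratio:.2f}")  # roughly 0.9 - most of the value is already captured
```

Under this (strong) assumption, increasing funds tenfold from $300m to $3b raises log-value by only about 12%, which is the sense in which hundreds of millions could be "a substantial fraction as good as" billions.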

Effective altruism is self-recommending
RobertWiblin · 8y · 140

"After they were launched, I got a marketing email from 80,000 Hours saying something like, "Now, a more effective way to give." (I’ve lost the exact email, so I might be misremembering the wording.) This is not a response to demand, it is an attempt to create demand by using 80,000 Hours’s authority, telling people that the funds are better than what they're doing already. "

I write the 80,000 Hours newsletter and it hasn't yet mentioned EA Funds. It would be good if you could correct that.

80,000 Hours: EA and Highly Political Causes
RobertWiblin · 8y · 30

"If we could somehow install Holden Karnofsky as president it would probably improve the lives of a billion people"

Amusingly, our suggestion of these two charities is entirely syndicated from a blog post put up by Holden Karnofsky himself: http://www.openphilanthropy.org/blog/suggestions-individual-donors-open-philanthropy-project-staff-2016

80,000 Hours: EA and Highly Political Causes
RobertWiblin · 8y · 110

Thanks for your interest in our work.

As we say in the post, on this and most problem areas 80,000 Hours defers charity recommendations to experts on that particular cause (see: What resources did we draw on?). In this case our suggestion is based entirely on the suggestion of Chloe Cockburn, the Program Officer for Criminal Justice Reform at the Open Philanthropy Project, who works full time on that particular problem area and knows much more than any of us about what is likely to work.

To questions like "does 80,000 Hours have view X that would make sense of this" or "is 80,000 Hours intending to do X", the answer is that we don't really have an independent view on any of these things. We're just syndicating content from someone we perceive to be an authority (just as we do when we include GiveWell's recommended charities without having independently investigated them). I thought the article was very clear about this, but perhaps we needed to make it even clearer in case people skipped down to a particular section without reading the preamble.

If you want to get these charities removed then you'd need to speak with Chloe. If she changes her suggestions - or another similar authority on this topic appears and offers a contrary view - then that would change what we include.

Regarding why we didn't recommend the Center for Criminal Justice Reform: again, that is entirely because it wasn't on the Open Philanthropy Project's list of suggestions for individual donors. Presumably that is because they felt their own grant - which you approve of - had filled their current funding needs.

All the best,

Rob

4 days left in Giving What We Can's 2015 fundraiser - £34k to go
RobertWiblin · 10y · 10

Yes, thanks so much to everyone who contributed! :)

Giving What We Can needs your help!
RobertWiblin · 10y · 00

Hi Eric - no they don't!

Giving What We Can needs your help!
RobertWiblin · 10y · 70

This fundraiser has been promoted on the Effective Altruism Forum already, so you may find your questions answered in these threads:

http://effective-altruism.com/ea/hz/please_support_giving_what_we_can_this_spring/

http://effective-altruism.com/ea/j9/giving_what_we_can_needs_your_help/

Six Ways To Get Along With People Who Are Totally Wrong*
RobertWiblin · 10y · 70

I'll re-post this comment as well:

"If I was going to add another I think it would be

  1. Have fun

Talking to people who really disagree with you can represent a very enjoyable intellectual exploration if you approach it the right way. Detach yourself from your own opinions, circumstances and feelings and instead view the conversation as a neutral observer who was just encountering the debate for the first time. Appreciate the time the other person is putting into expressing their points. Reflect on how wrong most people have been throughout history and how hard it is to be confident about anything. Don't focus just yet on the consequences or social desirability of the different views being expressed - just evaluate how true they seem to be on their merits. Sometimes this perspective is described as 'being philosophical'."

Posts

6 · 4 days left in Giving What We Can's 2015 fundraiser - £34k to go · 10y · 4
31 · Giving What We Can needs your help! · 10y · 6
38 · Six Ways To Get Along With People Who Are Totally Wrong* · 10y · 43
78 · Could you be Prof Nick Bostrom's sidekick? · 11y · 47
15 · The Centre for Effective Altruism is hiring to fill five roles in research, operations and outreach · 11y · 7
27 · The principle of ‘altruistic arbitrage’ · 13y · 7