Nathan Young

Wanted: Notation for credal resilience

I think the notation here feels unintuitive. I don't think I'd guess what it means from reading it. 

Perhaps: 1 day 80% [5, 20], lifetime 80% [2.5, 40], though as I say in the other comment, that just feels like a different confidence interval.

Wanted: Notation for credal resilience

This % chance of change should fold back into your original forecast, but I like that there is something signalling the depth of your confidence. 

Though it's unclear to me if confidence intervals suggest this notation already. If you had less chance of moving your interval, then it would already be a smaller interval, right?
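A toy sketch of why I suspect this (my own illustration, assuming beliefs are Beta distributions over a coin's bias; the function name is mine): a vaguer prior has both a wider interval and a bigger update after the same evidence, so resilience and interval width move together.

```python
# Toy illustration: for a Beta(a, b) belief about a coin's bias,
# the posterior mean after observing one head is (a + 1) / (a + b + 1).
def posterior_mean_after_head(a, b):
    return (a + 1) / (a + b + 1)

# Vague belief, Beta(1, 1): wide credible interval, large update.
shift_vague = posterior_mean_after_head(1, 1) - 0.5

# Confident belief, Beta(10, 10): narrow interval, small update.
shift_confident = posterior_mean_after_head(10, 10) - 0.5
```

Here `shift_vague` is about 0.17 while `shift_confident` is about 0.02, so in this toy model the narrower interval already implies the more resilient credence.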

Wanted: Notation for credal resilience

I'm interested. 

In some ways it's a lot like liquidity in a market. You are saying you'll buy $100 at 90%, then $200 at 80%, and so on. Someone can't just come in and force you to bet $1m at 90%; you'd think they had more information than you.
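The analogy above can be sketched as a ladder of resting bets, like depth in an order book (a hypothetical illustration; the data layout and function are mine, not anyone's actual notation):

```python
# Hypothetical "liquidity ladder": (probability, dollars) pairs meaning
# "I'll bet up to this many dollars at this probability or better",
# analogous to resting orders at different price levels in a market.
ladder = [(0.90, 100), (0.80, 200), (0.70, 500)]

def amount_available(ladder, prob):
    """Total dollars on offer at probabilities at or above `prob`."""
    return sum(size for p, size in ladder if p >= prob)
```

So a counterparty wanting to bet at 90% can only take $100 (`amount_available(ladder, 0.90)`), while one willing to trade at 80% can take $300 in total; the ladder caps how much anyone can extract at any given credence.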

[Beta Feature] Google-Docs-like editing for LessWrong posts

I mainly write on the EA forum, but I'd like to see articles that are in editing mode all the time, i.e. anyone can edit. I wonder how big a jump that is from this.

I've written about it here

Glen Weyl: "Why I Was Wrong to Demonize Rationalism"

As I said above, I struggled to follow the article and now can't be bothered to reread it.

But I agree that he now repudiates his previous conduct.

Feels like "I disagree with you but went about it the wrong way" is something we'd welcome from those who disagree with us, right?

Glen Weyl: "Why I Was Wrong to Demonize Rationalism"

I will try to explain what I know. I guess 90% accuracy on individual points, so some of it will be wrong.

Overview: I think Weyl has been going through a process of changing his mind for a year or two. Remmelt and I have both had conversations with him. I imagine there are more conversations and maybe some deep process we can't see.

I've talked to Weyl for an hour or so on Twitter 3 or 4 times. I liked his book and like him personally, so I spent some time teasing out his thoughts whenever I thought he was being unfair, e.g. here.

Iirc I'd lightly pushed for a while for him to A) talk to some actual rationalists and B) send documents with criticisms to rationalists directly rather than post them as open letters. I think a document posted by Weyl here would get a sober response. I've always felt Weyl was a sincere person who cares about AI risk etc., even if we disagreed. Also I genuinely like him, which makes it easier.

Four months ago, he wrote this 

"I have [thought about writing on LessWrong] but I am worried I would get the tone wrong enough that it would be a net harm. @RemmeltE has kindly been trying to mentor me on this.

and later to me

"Thanks for being so persistent with me about this. I do genuinely think that you’re basically right that my behavior here has been fundamentally hateful and against my principles, driven by feelings of guilt/shame and counterproductive to my own goals. I hope to have time 

Before going out on paternity leave to post an apology on LessWrong"

To me it felt as if he had a culturally different approach to AI risk than rationalists (he wants to get more people involved, and likes redistributing wealth and power), and maybe there was also some hurt. This led him (in my opinion) to overextend in his criticisms, mingling what I thought were fair and unfair commentary. The article he shared here I thought was unfair and didn't deserve Weyl's support. I guess I hoped he might change his mind, but I was still surprised when it happened (which makes me wonder if there were other things going on). I was particularly surprised by the strength of the first and this subsequent apology.

Some thoughts and suggestions:
- I found the apology article a bit hard to follow - I read it a couple of hours ago and I'm not sure I could explain it now
- Weyl seems to have done exactly what the rationalist part of me would want from him. If anything, it might be too much. I hope people are gracious to him for this. It probably cost him time, emotional energy, pride and possibly the respect of some others.
- I still wonder what led to him being so averse to rationalism in the first place.
- I'd suggest that, if you're interested, you thank him for the apology and talk to him about the subject.

I've struggled to write this accurately and humbly, so apologies if I've overcooked it. Thanks to Neel for suggesting I give my thoughts.

Gravity Turn

Really well written; I'd recommend crossposting to the EA forum if it isn't there already.
