User Profile

Karma: 5041 · Posts: 27 · Comments: 803

Recent Posts


[Link] Case Studies Highlighting CFAR’s Impact on Existential Risk
1y · 4 points · 1 comment

Results of a One-Year Longitudinal Study of CFAR Alumni
3y · 33 points · 31 min read · 35 comments

The effect of effectiveness information on charitable giving
4y · 15 points · 1 min read · 0 comments

Practical Benefits of Rationality (LW Census Results)
4y · 16 points · 17 min read · 5 comments

Participation in the LW Community Associated with Less Bias
6y · 31 points · 11 min read · 50 comments

[Link] Singularity Summit Talks
6y · 8 points · 1 min read · 3 comments

Take Part in CFAR Rationality Surveys
6y · 18 points · 1 min read · 4 comments

Meetup : Chicago games at Harold Washington Library (Sun 6/17)
6y · 0 points · 1 min read · 0 comments

Meetup : Weekly Chicago Meetups Resume 5/26
6y · 0 points · 1 min read · 0 comments

Meetup : Weekly Chicago Meetups
6y · 2 points · 0 comments

Recent Comments

His description of LW there is: "LW suggests (sometimes, not always) that Bayesian probability is the main tool for effective, accurate thinking. I think it is only a small part of what you need."

This seems to reflect the toolbox vs. law misunderstanding that Eliezer describes in the OP. Chapman i...(read more)

CFAR's 2013 description of its mission was "to create people who can and will solve important problems", via a community with a mix of competence, epistemic rationality, and do-gooding.

I expect that I (and many other users) would get more benefit out of this feature if it were more personalized. If I have personally upvoted a lot of posts by a user, then make that user's comments appear even larger _to me_ (but not to other readers). That way, the people who I like would be a "bigg...(read more)

I find this CFAR version of Focusing surprising. Good noticing. The exercise described here is one application of Focusing, for finding bugs. At CFAR workshops we do something like this in the class on Hamming problems. The CFAR class on Focusing is more similar to Conor's post and puts a lot more e...(read more)

Go to chrome://settings/, click "Advanced" at the bottom, unselect "Use a prediction service to help complete searches and URLs typed in the address bar" and maybe also "Use a prediction service to load pages more quickly".

One advantage of having both a weak intuitive model and a weak analytical model is that you can notice where there are mismatches in their predictions and flag them as places where you're confused. This helps with making predictions about specific cases. In cases where your intuitive naive physics a...(read more)

Scott mentioned that fact about superforecasters in his review; from what I remember the book doesn't add much detail beyond Scott's summary. One result is that while poor forecasters tend to give their answers in broad strokes – maybe a 75% chance, or 90%, or so on – superforecasters are more fin...(read more)

And mine is that it sounds like we did do much better than the average tech industry. Though maybe you/Scott/others have different intuitions than I do about how common it has been for tech folks to make a bunch of money on cryptocurrency. My impression from Scott's post was that we wouldn't differ ...(read more)