User Profile

Karma: 4 · Posts: 9 · Comments: 312

Recent Posts

Curated Posts
Curated: recent, high-quality posts selected by the LessWrong moderation team.
Frontpage Posts
Posts meeting our frontpage guidelines: interesting, insightful, or useful; aiming to explain, not to persuade; avoiding meta discussion; and relevant to people whether or not they are involved with the LessWrong community.
(includes curated content and frontpage posts)
Personal Blogposts
Personal blogposts by LessWrong users (as well as curated and frontpage posts).

80,000 Hours: EA and Highly Political Causes

1y
6 min read
25

[Link] Dominic Cummings: how the Brexit referendum was won

1y
69

Rationality Considered Harmful (In Politics)

1y
2 min read
20

A Review of Signal Data Science

2y
2 min read
15

Inverse cryonics: one weird trick to persuade anyone to sign up for cryonics today!

2y
2 min read
36

Request for help: Android app to shut down a smartphone late at night

3y
1 min read
30

(misleading title removed)

3y
2 min read
8

Recent Comments

> I don't think it will be very difficult to impart your intentions into a sufficiently advanced machine

Counterargument: it will be easy to impart an approximate version of your intentions, but hard to control the evolution of those values as you crank up the power. E.g. evolution, humans, make us...(read more)

I think 50% is a reasonable belief given the very limited grasp of the problem we have.

Most of the weight on success comes from FAI being quite easy, and all of the many worries expressed on this site not being realistic. Some of the weight for success comes from a concerted effort to solve hard ...(read more)

I guess there is a gap between the OP's intention and their behaviour? They intended to link to something, but it actually just self-links?

Thanks for your comment! Can you say which country?

> Could you tell me how you came about the list of African backward values?

Not in particular; the human brain tends to collect overall impressions rather than keep track of sources.

> I'd like the names of all the values I'd need to instil t...(read more)

Yeah, I mean maybe just make them float to the bottom?

One problem here is that we are trying to optimize a thing that is broken on an extremely fundamental level.

Rationality, transhumanism, hardcore nerdery in general attracts a lot of extremely socially dysfunctional human beings. They also tend to skew towards a ridiculously biologically-male-heavy...(read more)

> ...agree with that isn't just "+1 nice post." Here are some strategies...

How about the strategy of writing "+1 nice post"? Maybe we're failing to see the really blatantly obvious solution here....

+1 nice post btw

> someone was accidentally impregnated and then decided not to abort the child, going against what had previously been agreed upon, and proceeded to shamelessly solicit donations from the rationalist community to support her child

They were just doing their part against dysgenics and should be com...(read more)

> word is going around that Anna Salamon and Nate Soares are engaging in bizarre conspiratorial planning around some unsubstantiated belief that the world will end in ten years

Sounds interesting, I'd like to hear more about this.

My impression, in retrospect, of the appeal of LW is that it (on average) attracted people who were or are underperforming relative to g (this applies to myself). When you are losing, you increase variance. When you are winning, you decrease it.

This also applies to me.