Uriel Fiori


Comments

Are there non-AI projects focused on defeating Moloch globally?

I really can't help here, because I happen to think Moloch isn't only inevitable but positively good (not just better than the alternatives, but actually best-possible-world good).

How do you organise your reading?

I have my books organized into lists, with at least some semblance of weekly goals for getting through them (unfortunately I put stuff like Kant right at the beginning, so I've been failing a lot).

Then I have Feed.ly feeding me updates from my favourite blogs. Anything up to two pages of text I read on the spot; everything else goes to a list I keep in my self-inbox on Telegram, and I read at least one item from it a day, in chronological order.

I am currently reading the whole of Unqualified Reservations as well; I usually read blogs in their entirety. These I keep in my mobile Chrome app, saving the links to the Telegram list.

Why isn’t assassination/sabotage more common?

I sense it is mostly because people naturally refrain from murder unless it is seen as a last-resort measure or as having hugely positive consequences.

How do you survive in the humanities?

I feel like the best approach is using your position to make them question themselves. Say, by pointing out that a lot of their commitments sound like religious fundamentalism, or some such device. You're studying creative writing; do some creative arguing XD

Political Roko's basilisk

I guess Brexit is something along those lines, ain't it?

Accelerate without humanity: Summary of Nick Land's philosophy

Yes, if the paperclipper is thought to become ever more intelligent, its end goal could be anything, and it's likely it would come to see its own capability improvement as the primary goal ("the better I am, the more paperclips get produced"), etc.

A Theory of Pervasive Error

The father of NRx is actually Mencius Moldbug (I see people (co-)attributing it to Land, but in fact Land just did a lot of reinterpretation of some of Moldbug's themes).

Accelerate without humanity: Summary of Nick Land's philosophy

"Unless the ones with goals have more power, and can establish a stable monopoly on power (they do, and they might)"

More than the ones optimizing for increasing their power? I find it doubtful.

Accelerate without humanity: Summary of Nick Land's philosophy

Well, any answer to the thread in the two I linked above would already be really interesting. His new book on Bitcoin is really good too: http://www.uf-blog.net/crypto-current-000/
