Artir
Artir has not written any posts yet.

See here: https://www.lesswrong.com/posts/RsDwRmHGvf6GqaQkE/why-so-little-ai-risk-on-rationalist-adjacent-blogs?commentId=rumGEbYYnHBcRxx6c
Hi, I'm the author of Nintil.com. As of today I think the endorsement I gave to Yarvin's argument was too strong, and I have just amended the post to make that clear. I added the following:
[Edit 2022-06-14]: I think some overall points in Yarvin's essay are valid (the world is indeed uncertain and there are diminishing returns to intelligence), but AGIs would still have the advantage of speed and parallelism (imagine the entirety of Google, but with no need for meetings and with workweeks run at 100x speed). Even in the absence of superior intelligence, that alone leads to capabilities beyond what a human or group thereof can accomplish. I don't know...
In the asteroid case, it wouldn't be inevitable; it's just the knowledge that there are people out there substantially more motivated than me (and better positioned) to deal with it. For some activities where I'm really good (like... writing blogposts), and where I expect my actions to make more of an impact relative to what others would be doing, I could end up writing a blogpost about 'what you guys should do' and emailing it to some other relevant people.
Also, feel free to edit your post to reflect my update!