Something like that, yeah

I don't mean this only for group sizes, but good point: there could be a qualitative difference, and simplifying might actually be fundamentally changing the topic.

I don't know, I still feel like it helps me figure out the core of a problem. However, I agree that asking if a proposed solution scales is important for the types of issues I listed in the examples.

Your strategy for AI risk seems to be "Let's not build the sort of AI that would destroy the world", which fails at the first word:  "Let's".

I don't have a strategy, I'm basically just thinking out loud about a couple of specific points. Building a strategy for preventing that type of AI is important, but I don't (yet?) have any ideas in that area.

Time-slack isn't rewarded with status that much, I think. When someone can say "yeah, whenever's fine" in response to somebody who can only make it for exactly 4.32 minutes every second full Moon, but only in January, I rarely find that this person is awarded status, even implicitly. It's basically taken for granted. Which reinforces your point that high-slack people don't capture the upside that much.

And which, in turn, leads me to ask: is the status payoff enough even for a rough selection? I think not. To reliably select for high-slack people (and therefore create high-slack groups), even roughly, I think you need to explicitly require some X amount of slack (easy for time, difficult for emotions).

And, of course, to make the implicit explicit - which seems to be the point of your post.

Regarding 2 and 3: that's the main practical perk of reading LessWrong, or as I'm inclined to call it now, SoonerRight.

Thanks everyone!

In my experience (Zagreb), you have this same organic development which leads to very crowded buildings with drastically different styles (like massive apartment buildings "boxing in" houses), very little pedestrian space, few parks and green areas... Some pretty messy and inhospitable neighborhoods.

Also some really good ones, so I'm wondering if the main factor is "some person in charge of a building wants to ensure that it fits the neighborhood".

For me, probably 2. I read "How to become a hacker" several years ago and it shaped many of my career-related choices. The writing/reasoning style is very similar to the ratsphere, so I was not too surprised that I would also find you here.

I totally get the frustration, that's why I felt the disclaimer in the beginning was necessary!

As for the question of many students - yes, absolutely. Promoting EA is a smart and valuable goal, and will definitely have a greater effect ("or you raise awareness in town, and try to explain to others that there are children drowning in some ponds nearby"). And, as you say, it's precisely what Singer is doing.

Regarding systemic change: I think that's a conversation stopper in many cases. People say "X is cool and everything, but what we REALLY need is systemic change". But that's, like, a really big task, and it seems to me that it just breeds inaction, as opposed to interventions. I wasn't going for an applause light, only a very narrow criticism of one specific analogy/argument.

I'm sorry you don't find it valuable. It's an argument that bugged me - I first heard it only a couple of years ago, on a podcast completely unrelated to EA, accepted it as valid, but felt that something was off. I worked through my confusion and this is the result. Maybe everyone who hears it immediately thinks of all the criticism you listed, but I doubt it.

Who benefits from the last sentence? I guess people like me, or whoever hears the analogy and accepts it without first analyzing it a bit.

It's not criticism of Singer in general either. Chris says that this analogy is only the beginning of his argument, and I totally agree (and I happen to agree with almost all of Singer's conclusions, at least those that I've read afterwards).

That is true: rarely do you get someone who intentionally wants to make you miserable. They usually just make you miserable as a side-effect of not caring enough, but as soon as you're sufficiently annoying, they do one of those two things.
