Nisan

Comments

Jimrandomh's Shortform

Ok, I think in the OP you were using the word "secrecy" to refer to a narrower concept than I realized. If I understand correctly: if Alice tells Carol "please don't tell Bob", and then five years later, when Alice is dead or definitely no longer interested or it's otherwise clear that there won't be negative consequences, Carol tells Bob, and Alice finds out and doesn't feel betrayed, then you wouldn't call that a "secret". I guess for it to be a "secret" Carol would have to promise to carry it to her grave, even if circumstances changed, or something.

In that case I don't have strong opinions about the OP.

plutonic_form's Shortform

Become unpersuadable by bad arguments. Seek the best arguments both for and against a proposition. And accept that you'll never be epistemically self-sufficient in all domains.

Jimrandomh's Shortform

Suppose Alice has a crush on Bob and wants to sort out her feelings with Carol's help. Is it bad for Alice to inform Carol about the crush on condition of confidentiality?

Animal welfare EA and personal dietary options

Your Boycott-itarianism could work just through market signals. As long as your diet makes you purchase less high-cruelty food and more low-cruelty food, you'll increase the average welfare of farm animals, right? Choosing a simple threshold and telling everyone about it is additionally useful for coordination and maybe sending farmers non-market signals, if you believe those work.

If you really want the diet to be robustly good regardless of whether farm animals' lives are net-positive, you'd want to tune the threshold so as not to change the number of animals consumed (per person per year, compared to a default diet, over the whole community). You'd have to estimate price elasticities and dig into the details of "cage-free", etc.
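As a toy version of that tuning calculation (everything below is invented for illustration: the products, the numbers, and the naive assumption that dropped meals get reallocated proportionally among the remaining products, with no market-level elasticity effects):

```python
# Toy sketch: pick a cruelty threshold such that a boycott-style diet leaves the
# total number of animals consumed per person per year roughly unchanged, while
# shifting meals toward lower-cruelty products. All numbers are made up.

# product: (meals per year in the default diet, animals per meal, cruelty score 0-10)
DEFAULT_DIET = {
    "chicken":        (150, 0.5,   8.0),
    "eggs_caged":     (100, 0.05,  9.0),
    "eggs_cage_free": ( 20, 0.05,  6.0),
    "pork":           ( 50, 0.02,  7.0),
    "beef":           ( 60, 0.005, 3.0),
}

def animals_consumed(diet):
    """Total animals consumed per person per year under a given diet."""
    return sum(meals * animals_per_meal for meals, animals_per_meal, _ in diet.values())

def boycott_diet(threshold):
    """Drop products whose cruelty score exceeds `threshold` and reallocate the
    dropped meals proportionally among the remaining products."""
    kept = {k: v for k, v in DEFAULT_DIET.items() if v[2] <= threshold}
    if not kept:
        return {}
    dropped_meals = sum(m for m, _, c in DEFAULT_DIET.values() if c > threshold)
    kept_meals = sum(m for m, _, _ in kept.values())
    scale = 1 + dropped_meals / kept_meals
    return {k: (m * scale, a, c) for k, (m, a, c) in kept.items()}

if __name__ == "__main__":
    baseline = animals_consumed(DEFAULT_DIET)
    for threshold in (3, 6, 7, 8, 9):
        total = animals_consumed(boycott_diet(threshold))
        print(f"threshold {threshold}: {total:6.1f} animals/yr (baseline {baseline:.1f})")
```

With made-up numbers like these you'd scan the candidate thresholds and pick the one whose total comes closest to the baseline; a real version would replace the proportional-reallocation assumption with estimated demand elasticities and product-level welfare data.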

My take on higher-order game theory

Yep, I skimmed it by looking at the colorful plots that look like Ising models and reading the captions. Those are always fun.

My take on higher-order game theory

No, I just took a look. The spin glass stuff looks interesting!

My take on higher-order game theory

I think you're saying , right? In that case, since embeds into , we'd have embedding into . So not really a step up.

If you want to play ordinal games, you could drop the requirement that agents are computable / Scott-continuous. Then you get the whole ordinal hierarchy. But then we aren't guaranteed equilibria in games between agents of the same order.
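As a toy illustration of that last point (the framing and notation here are mine, a cartoon of the phenomenon rather than the post's actual domain-theoretic setup): take two "order-1" agents that are each an arbitrary function of the other's move on the move set $\{0,1\}$,

\[
f(y) = y, \qquad g(x) = 1 - x.
\]

A consistent joint play would be a pair $(x, y)$ with $x = f(y)$ and $y = g(x)$, i.e. $x = y$ and $y = 1 - x$, which has no solution in $\{0,1\}$. So once best responses can be arbitrary functions rather than continuous/computable ones, a matching-pennies-style pair already has no fixed point.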

I suppose you could have a hybrid approach: Order is allowed to be discontinuous in its order- beliefs, but higher orders have to be continuous? Maybe that would get you to .

Tears Must Flow

And as a matter of scope, your reaction here is incorrect. [...] Reacting to it as a synecdoche of the agricultural system does not seem useful.

On my reading, the OP is legit saddened by that individual turkey. One could argue that scope demands she be a billion times sadder all the time about poultry farming in general, but that's infeasible. And I don't think that's a reductio against feeling sad about an individual turkey.

Sometimes, sadness and crying are about integrating one's beliefs. There's an intuitive part of your mind that doesn't understand your models of big, global problems. But, like a child, it understands the small tragedies you encounter up close. If it's shocked and surprised, then it is still learning what the rest of you knows about the troubles of the world. If it's angry and outraged, then there's a sense in which those feelings are "about" the big, global problems too.

Yudkowsky and Christiano discuss "Takeoff Speeds"

I apologize, I shouldn't have leapt to that conclusion.

Yudkowsky and Christiano discuss "Takeoff Speeds"

it legitimately takes the whole 4 years after that to develop real AGI that ends the world. FINE. SO WHAT. EVERYONE STILL DIES.

By Gricean implicature, saying "everyone still dies" presents it as relevant to the post's thesis, which implies that the post's thesis is that humanity will not go extinct. But the post is about the rate of AI progress, not human extinction.

This seems like a bucket error, where "will takeoff be fast or slow?" and "will AI cause human extinction?" are put in the same bucket.
