And there is no one less suited than a corpse for time-sensitive emergency situations.
This is very false.
I have personally seen people outright panic in emergency situations, and a corpse would be far more helpful than a panicking person.
Do you have thoughts on how this can be squared with trying to have principles and trying to follow something like functional decision theory?
Consider the following (real) example:
I'm going to a building with an emergency fire exit that I'm not supposed to open. Opening it doesn't trigger an alarm, and it's the only door that leads directly outside, so I want to open it to get some fresh air.
To me this seems like a clear example of a rule one should break, if one is slightly naughty. The problem is that to be allowed in the building I agreed to follow the house rules.
Opening the door feels almost like lying, and I'd rather not lie. I don't think an FDT agent would open the door.
My best guess at a resolution is that I can open the door if the owner would still allow me in knowing that I'll occasionally open it. If he's fine with me doing it occasionally, even though he'd prefer that I never do it, I am allowed to open the door. But if he would throw me out if he knew that I broke the rule, then it doesn't seem right to open the door. It's not what an FDT agent would do, even if I know I won't be caught and I'm sure that my opening the door will cause no problems.
But this seems like a very non-naughty way to act. I think most people would just open the door if they really knew they wouldn't be caught.
Or take this extreme example:
If I could press a button that made it impossible for me to lie, and this was common knowledge, I'd press it. Even though it would probably make me much less naughty, it would be worth it because of all the trust I'd gain. If I think about what would most annoy me, it's situations where I'm expected to lie. If I'm asked at the airport whether I belong to any "extremist ideologies", maybe I can get away with saying no, because they really mean something like "violent ideologies" and they don't care about extreme beliefs I have about ethics or AI. But surely there are other situations where lying to a bureaucracy is just expected, and not being able to lie would be very annoying.
But everyone is already naughty in these situations, so I'm guessing your suggestion goes beyond lying in these kinds of situations.
while i dislike the word naughtiness and its connotations, i don't see the contradiction. as the post states at the start, "They delight in breaking rules, but not rules that matter." all Decision Theory rules are rules that matter.
this doesn't answer the question of which rules to break and which not. but you can't be Lawful by following all laws in our way-too-bureaucratic society.
my mother thinks that going outside while wearing pajamas is unthinkable. it can probably be described as naughty. it's also totally harmless.
but also, now that i've thought about your comment, it's become clear that i actually don't know what "Be Naughty" even means.
i think this is the kind of post that would benefit greatly from at least three examples.
Similar advice is given in Taleb's Antifragile.
A fragile system, with a naughty or mischievous actor inside it, will fall apart. Maybe fragile systems should be encouraged to fall apart.
An antifragile system is one which benefits from a bit of chaos and randomness. Naughty and mischievous actors in these systems are beneficial, injecting the randomness needed for growth and for avoiding decay.
The quote is about Sam Altman, who now leads OpenAI, and there are some moral problems with the way OpenAI is led by him.
Nice, I'm glad you wrote this bc I've been annoyed at how goody-two-shoes people in the EA community seem. It feels like even though they are smart, hardworking, and integrous(?), they do not have sufficient agency. The last rule they broke was taking food from the cafeteria, not actual naughtiness.
Context: Post #10 in my sequence of private Lightcone Infrastructure memos edited for public consumption.
This one, more than any other in this sequence, is something I do not think is good advice for everyone, and I do not expect it to generalize well to broader populations. If I had been writing this with the broader LessWrong audience in mind, I would have written something pretty different. But for the sake of transparency I feel I should include all the memos on Lightcone principles I have written, and this one in particular would feel like a bad one to omit.
In "What We Look for in Founders" Paul Graham says:
The world is full of bad rules, and full of people trying to enforce them. Not only that, it's commonplace to combine those rules with memes and social pressure to get you to internalize those rules as your own moral compass.
A key tension I repeatedly notice in myself as I am interfacing with institutions like zoning boards, or university professors asking me to do my homework, or not too infrequently Effective Altruists asking me to be vegan, is that together with the request to not do something, comes a request to also adopt a whole stance of "good people do not do this kind of thing".
Moral argument, of course, is important and real. And social institutions rely on shared norms and ethical codes. But nevertheless, almost all rules so invoked are not worthy of the guilt they produce when internalized. Their structure and claims are often easily exposed as flimsy, and their existence is often better explained by rent-seeking or other forms of power-preservation, or simply by reinforced historical accident or signaling competition, than by genuine moral inquiry or some other kind of functional search over the space of social rules.
A name I considered for today's principle is "have courage". The kind of courage Neville displayed when he tried to prevent Harry and his friends from going to the forbidden third floor against Dumbledore's warnings, and the kind Harry and his friends displayed when they barged right past him anyways. "Courage" as such, is having the strength of will to break the rules that deserve to be broken.
But I ultimately didn't like "courage", and preferred Paul Graham's "naughtiness"[1]. Courage implies a fear to be overcome, or the presence of some resistance, whereas I think the right attitude is often to miss a mood completely, and to take active joy in violating bad rules. The right attitude towards requests to avoid blasphemy is not hand-wringing and a summoning of courage every time you speak of something adjacent to god; it is simply to never bother thinking of this consideration, unless you are talking directly to someone who might care and the social consequences of the speech act become practically relevant.
Of course some moral rules are important, and some appeals to guilt are valid. As far as I can tell, there is no simple rule to distinguish the good appeals from the bad ones.
However, it is IMO possible to identify certain subsets of moral appeals as invalid and broken. Ozy Brennan identifies one such subset in "The Life Goals of Dead People":
Lightcone is not an organization for people who would rather be corpses. In the pursuit of our goals, we need the courage to make choices that will violate a large number of moral guidelines other people hold. Only corpses are this certain kind of pure.
Of course, while naughtiness so defined strikes me as a prerequisite to moral greatness, it also appears to be a prerequisite for most forms of moral damnation. Corpses don't cause atrocities. While I am confident that in order to live a life well-lived you need to take delight in breaking some rules, taking delight in breaking the wrong rules sets you up for a life of great harm.
Indeed, the very next paragraph in the Paul Graham essay I cite above says:
Noticing the skulls is left as an exercise to the reader.
I also felt that the word "courage" deserves to be reserved for something else, which I might end up writing about more at a later point in time.