topherhunt

Comments (sorted by newest)

How anticipatory cover-ups go wrong
topherhunt · 6d · 10

> It would be nice to end this post with a recommendation of how to avoid these problems. Unfortunately, I don’t really have one, other than “if you are withholding information because of how you expect the other party to react, be aware that this might just make everything worse”.

Maybe this is me being naive, but this seems like a topic where awareness of the destructive tendency can help defeat it. How about this as a general policy: "I worry that this info will get misinterpreted, but here's the full information, along with a brief clarification of how I feel it should and shouldn't be interpreted"?

To hostile listeners, you've given slightly less ammo than in the likely scenario where they caught you concealing the info. To less-hostile listeners, you've (a) built credibility by demonstrating that you'll share info even when it doesn't strengthen your cause, and (b) made them more resilient against the misinterpretation you're anticipating, by explicitly calling it out in advance (inoculation / prebunking).

- By erring on the side of transparency while publicly acknowledging certain groups' likelihood of coming to a distorted conclusion, I bet the CDC would have avoided a disastrous erosion of public trust and reinforcement of the "don't trust the experts" vibe.
- By bringing up Bob's evasive communication during the client prep and the anxiety it created for her, Alice would have deepened trust between them (granted, at the risk of straining the relationship if he did turn out to be irredeemably thin-skinned).
- ...OK, actually, the cult/sect situation seems more complex; it seems to have more of a multipolar-trap (?) quality: "maybe no single individual feels safe/free to make the call that most people know would collectively be best for the group."

It still seems to me that awareness of this trap/fallacy and its typical consequences can help a person or group make a far less damaging decision here.

The Great Data Integration Schlep
topherhunt · 1y · 30

> it takes me longer to ask the LLM repeatedly to edit my file to the appropriate format than to just use regular expressions or other scripting methods myself

Not surprised. I would expect GPT to be better at helping me identify data-cleaning issues and plan out how to safely fix each of them, and less good at actually producing the cleaned data (which I wouldn't trust to be hallucination-free anyway).
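
For concreteness, here's the kind of deterministic cleanup I have in mind (a minimal sketch in Python; the M/D/YYYY date format and the sample strings are hypothetical):

```python
import re

# Hypothetical cleanup pass: normalize dates like "3/7/2024" (M/D/YYYY)
# into ISO "2024-03-07". A deterministic rule like this is auditable and
# can't hallucinate new values the way an LLM rewrite could.
DATE_RE = re.compile(r"\b(\d{1,2})/(\d{1,2})/(\d{4})\b")

def normalize_dates(text: str) -> str:
    return DATE_RE.sub(
        lambda m: f"{m.group(3)}-{int(m.group(1)):02d}-{int(m.group(2)):02d}",
        text,
    )

print(normalize_dates("signed up 3/7/2024, cancelled 11/15/2024"))
# -> signed up 2024-03-07, cancelled 2024-11-15
```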

What Are You Tracking In Your Head?
topherhunt · 3y · 32

When programming, I track a mixed bag of things, chief among them readability: will me-6-months-from-now be able to efficiently reconstruct the intention of this code, track down the inevitable bugs, etc.?
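
A hypothetical micro-example of the contrast I mean (the function names and fields are invented):

```python
# Hard for me-6-months-from-now to reconstruct:
def chk(u):
    return u["a"] >= 18 and not u["d"]

# Much easier to reconstruct the intention of (and to debug):
def is_eligible_adult(user: dict) -> bool:
    """True if the user is at least 18 and their account isn't deactivated."""
    return user["age"] >= 18 and not user["deactivated"]
```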

Ideal governance (for companies, countries and more)
topherhunt · 3y · 70

I'm surprised that this whole conversation has happened with no mention of the minor but growing trend towards self-management organizational structures, teal organizations, Holacracy, or Sociocracy.

I have some experience with Holacracy, and while I would never call it a cure-all, I feel strongly that its driving principles are relevant to the question of what an ideal governance system would look like: e.g., a structure of nested units/teams with high levels of local autonomy; a distinctive method of making governance decisions about how to change that structure; mechanisms that privilege "moving forward" over inaction due to conflictual gridlock; a fluid process for defining power-holding "roles" and appointing individuals to them; etc.

Avoiding Your Belief's Real Weak Points
topherhunt · 5y · 10

> you can find God killing the first-born male children of Egypt to convince an unelected Pharaoh to release slaves who logically could have been teleported out of the country. An Orthodox Jew is most certainly familiar with this episode

I've seen Yudkowsky make this point in a couple of places (why bother inflicting mass infanticide etc. etc. when you're presumably omnipotent and could teleport everyone to safety?), and it makes me blink; something about the argument feels off. Are there cases in the scriptures where God teleports large numbers of people large distances? I get, and vehemently agree with, the point being made here (you can't deny that in this and many other stories, God had more humane alternatives available and knowingly opted for a crueler one), but unless there's a clear precedent for mass teleportation, this specific argument seems to strawman the religious belief a little.

0 And 1 Are Not Probabilities
topherhunt · 5y · 10

> What are the odds that the face showing is 1? Well, the prior odds are 1:5 (corresponding to the real number 1/5 = 0.20)

I'm years late to this party, and probably missing something obvious. But I'm confused by Yudkowsky's math here. Wouldn't it be more correct to say that the prior odds of rolling a 1 are 1:5, which corresponds to a probability of 1/6 or 0.1666...? If odds of 1:5 correspond to a probability of 1/5 = 0.20, that makes me think there are 5 sides to this six-sided die, each side having equal probability.

Put differently: when I think of how to convert odds back into a probability number, the formula my brain settles on is not P = o / (1 + o) as stated above, but rather P = L / (L + R), if the odds are expressed as L:R. Am I missing something important about common probability practice / jargon here?
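
A quick arithmetic check of the two formulas (a minimal sketch; note that with o = L/R the two expressions coincide algebraically):

```python
from fractions import Fraction

# Odds of rolling a 1 on a fair six-sided die, expressed as L:R.
L, R = 1, 5

# The formula my brain settles on: P = L / (L + R).
p_mine = Fraction(L, L + R)

# The post's formula: P = o / (1 + o), with the odds as one ratio o = L/R.
o = Fraction(L, R)
p_post = o / (1 + o)

print(p_mine, p_post)  # 1/6 1/6 -- both give 1/6, not 1/5
```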
