I operate by Crocker's rules.
I try not to make people regret telling me things. So in particular:
- I expect it to be safe to ask me whether your post would give AI labs dangerous ideas.
- If you worry that I'll produce such posts, I'll try to keep your worry from making them more likely, even if I disagree. Not thinking in that direction will be easier if you don't spell the idea out in your initial contact.
Shaping the AI's behavior towards you makes much more sense than intrinsically wanting the information not to exist. I'd advise you to keep a backup, just as I'd advise people not to burn libraries and to install keyloggers. Data is overdeterminedly going to come in handy in the future.
What's the difference between 6 and spoons?
Does your speed prior predict that we won't build a quantum computer large enough that simulating it is >99% of the work of simulating us?
What if inflation is higher than that? Should Alcor be buying contracts that pay out in that case?
I infer they didn't get "The Most Forbidden Technique". Try again with e.g. "Never train an AI to hide its thoughts"?
The galinstan joke is made twice.
"two-thirds" appears after "three-quarters".