Yeah, if I had the comment to rewrite (I'd prefer not to edit it at this point), I would say: "My whole objection is that Gordon wasn't bothering to (and at this point in the exchange I have a hypothesis that it reflects not being able to, though that hypothesis comes from gut-level systems and is wrong-until-proven-right, as opposed to, like, a confident prior)."

G Gordon Worley III's Shortform
