I've thought about this a bit, and I still don't see the line of reasoning that leads you to suggest this: I don't see a reduction happening here, much less one that merely bundles the confusion together so that it only looks simpler. Can you say more to make your perspective clearer?

G Gordon Worley III's Shortform
Crossposted from the AI Alignment Forum. May contain more technical jargon than usual.