I'm not sure I understand your recommendation. You talk about the pilot as a constraint and the obvious removal of that constraint (unmanned fighters). This is the opposite of a natural law: it's an assumed constraint, or a constraint within a model, not a natural law.

I think "We have a good command of natural law at the scale where warmachines operate." is exactly the opposite of what I believe. We have some hints as to natural law at those scales, but we're nowhere near those constraints. There are a huge number of contingent const...

Natural laws should be explicit constraints on strategy space

by ryan_b · 13th Aug 2019 · 2 comments



Most strategic developments have been incremental: doing a bit better than whatever the opponent is doing. Sometimes there are paradigm shifts, which largely mean a new dimension along which to make incremental improvements.

But we cannot increment forever. Sometimes there is a well-understood limit we cannot surpass. Energy-Maneuverability theory is a paradigm for designing air superiority fighters. Though it shifted the paradigm away from the old speed/altitude/turn-rate metrics, we remain constrained by the ability of the human pilot to withstand g-forces. We have already built aircraft that can climb higher, accelerate faster, and turn more sharply than a pilot can tolerate without passing out. It may even be possible to design an almost-perfect manned fighter, one in which the pilot is the binding constraint in every dimension of performance.
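The pilot-as-constraint point can be made concrete with standard level-turn kinematics (a back-of-the-envelope sketch, not from the post; the airspeed and the 15 g airframe limit are assumed numbers): sustained turn rate scales with load factor, so a human g-tolerance of roughly 9 g caps maneuverability even when the airframe could pull harder.

```python
# Sketch: sustained turn rate for a level turn, omega = g * sqrt(n^2 - 1) / v,
# where n is the load factor (in g) and v is true airspeed. A pilot's
# g-tolerance caps n, and therefore omega, below what the airframe allows.
import math

G = 9.81  # gravitational acceleration, m/s^2


def sustained_turn_rate(n, v):
    """Turn rate in rad/s for a level turn at load factor n and airspeed v (m/s)."""
    return G * math.sqrt(n**2 - 1) / v


v = 250.0  # m/s, an assumed airspeed for illustration
pilot_limited = sustained_turn_rate(9.0, v)     # ~9 g human tolerance with a g-suit
airframe_limited = sustained_turn_rate(15.0, v)  # hypothetical 15 g airframe limit
```

At these assumed numbers the airframe-limited turn rate is roughly two thirds faster than the pilot-limited one, which is the gap an unmanned design could exploit.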

But now we have unmanned drones.

Natural law provides a variety of hard limits, like the speed of light or the increase of entropy. We have a good command of natural law at the scales where war machines operate. It seems like good policy to adopt these as the explicit constraints on strategy-space, and to map what we know about our opponents onto them. This would tell us how much room there even is for incremental improvement, and give some indication of where we are vulnerable to (or have an opportunity to create) a paradigm shift.

Since most of these natural limits are well known, and most dimensions of strategy don't have anything as obvious as c to bound them, there isn't an obvious motivation to do this. But it feels to me like even something as conceptually straightforward as operations research or mathematical programming would work for it.
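To gesture at what the mathematical-programming framing might look like, here is a toy sketch (illustrative numbers only, continuing the turn-rate example rather than anything from the post): treat fighter design as a constrained search over (load factor, airspeed), and watch the feasible region, and hence the optimum, change when the pilot constraint is removed while the natural/airframe limit stays fixed.

```python
# Toy constrained optimization: maximize sustained turn rate over a grid of
# designs, subject to a g-limit. pilot_g_limit=None models an unmanned craft,
# leaving only the airframe (physical) limit binding.
import math

G = 9.81  # m/s^2


def turn_rate(n, v):
    """Sustained level-turn rate (rad/s) at load factor n and airspeed v (m/s)."""
    return G * math.sqrt(n**2 - 1) / v


def best_design(airframe_g_limit, pilot_g_limit=None):
    """Grid-search (n, v) designs under whichever g-limit binds."""
    g_limit = airframe_g_limit if pilot_g_limit is None else min(
        airframe_g_limit, pilot_g_limit)
    candidates = [(n, v)
                  for n in range(2, int(g_limit) + 1)
                  for v in range(150, 401, 50)]
    return max(candidates, key=lambda nv: turn_rate(*nv))


manned = best_design(airframe_g_limit=15, pilot_g_limit=9)
unmanned = best_design(airframe_g_limit=15)
```

The point of the sketch is the structure, not the numbers: dropping the contingent constraint (the pilot) moves the optimum out to the natural-law boundary, and a real treatment would swap the toy grid search for a proper optimization model with many more constraint dimensions.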