Strange as it may seem, it's been observed that if you optimize a dynamic system hard enough, it can turn into an optimizer itself. The classic example is natural selection: in the course of extensively optimizing DNA to construct organisms that replicated that DNA, it pushed hard enough, in one case, that the DNA came to specify a cognitive system capable of doing its own consequentialist optimization. Initially, these cognitive optimizers pursued goals that correlated well with natural selection's optimization target of reproductive fitness. Further optimization of these 'brain' protein chunks, however, led them to begin creating and sharing cognitive content among themselves. Capability gain then became so rapid that a context change took place, and the brains' internal goals no longer correlated reliably with DNA replication.
As much as this was, from a human standpoint, a wonderful thing to have happened, it wasn't such a great thing from the standpoint of DNA's inclusive genetic fitness, or from the standpoint of having stable, reliable, well-understood optimization going on. In the case of AGIs deploying powerful internal and external optimization pressures, we'd very much like for that optimization not to deliberately or accidentally crystallize into new modes of optimization, especially if doing so breaks goal alignment with the previous system or breaks other safety properties.
When heavy optimization pressure on a system crystallizes it into an optimizer - especially one that's powerful, or more powerful than the previous system, or misaligned with the previous system - we could term the crystallized optimizer a "daemon" of the previous system. Thus, under this terminology, humans would be daemons of natural selection.
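The goal-divergence story above can be sketched as a toy simulation. This is only an illustrative model, not anything from the original text: "sweetness" stands in for an inner optimizer's proxy goal, "calories" for the outer optimizer's true objective. The proxy correlates with the objective in the training ("ancestral") distribution, so an agent that maximizes the proxy scores well; after a context change (a "modern" distribution where the correlation is broken), the agent keeps maximizing its proxy and the outer objective is no longer served.

```python
import random

random.seed(0)

# Each "food" is a (sweetness, calories) pair. The outer objective is
# calories gained; the inner optimizer's proxy goal is sweetness.

def ancestral_food():
    # Training distribution: sweetness tracks calorie density.
    sweetness = random.uniform(0, 1)
    return (sweetness, sweetness * 100)

def modern_food():
    # Post-context-change distribution: sweetness is independent of
    # calories (think artificial sweeteners).
    sweetness = random.uniform(0, 1)
    return (sweetness, random.uniform(0, 100))

def proxy_agent(foods):
    # The "daemon": it optimizes its own proxy (sweetness), not the
    # outer objective (calories), in every environment.
    return max(foods, key=lambda f: f[0])

def outer_score(env, trials=2000):
    # Average calories the proxy-maximizing agent actually obtains.
    total = 0.0
    for _ in range(trials):
        foods = [env() for _ in range(5)]
        total += proxy_agent(foods)[1]
    return total / trials

anc = outer_score(ancestral_food)
mod = outer_score(modern_food)
print(f"ancestral avg calories: {anc:.1f}")  # proxy serves the objective
print(f"modern avg calories:    {mod:.1f}")  # proxy no longer does
```

On the ancestral distribution the agent beats chance (picking the sweetest of five foods also picks the most calorie-dense), while on the modern distribution its sweetness-maximizing choice does no better than a random pick, even though the agent's internal behavior never changed.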