Just some stuff I found online. I'm not sure all of these are actually good, but they seem promising as hunch inspiration. As usual, I'm sharing this because I'm not going to be the brain who turns these into non-crackpot conclusions, but in my view, people who haven't had enough hunch seeding from this area of thought are probably missing insights.

Previous posts in this sequence are highly related to these topics, especially the call for submissions.

Michael Levin unfinished-research talk

He's been posting quite a few!

The cost of information acquisition by natural selection

Natural selection enriches genotypes that are well-adapted to their environment. Over successive generations, these changes to the frequencies of types accumulate information about the selective conditions. Thus, we can think of selection as an algorithm by which populations acquire information about their environment. Kimura (1961) pointed out that every bit of information that the population gains this way comes with a minimum cost in terms of unrealized fitness (substitution load). Due to the gradual nature of selection and ongoing mismatch of types with the environment, a population that is still gaining information about the environment has lower mean fitness than a counter-factual population that already has this information. This has been an influential insight, but here we find that experimental evolution of Escherichia coli with mutations in an RNA polymerase gene (rpoB) violates Kimura’s basic theory. To overcome the restrictive assumptions of Kimura’s substitution load and develop a more robust measure for the cost of selection, we turn to ideas from computational learning theory. We reframe the ‘learning problem’ faced by an evolving population as a population versus environment (PvE) game, which can be applied to settings beyond Kimura’s theory – such as stochastic environments, frequency-dependent selection, and arbitrary environmental change. We show that the learning theoretic concept of ‘regret’ measures relative lineage fitness and rigorously captures the efficiency of selection as a learning process. This lets us establish general bounds on the cost of information acquisition by natural selection. We empirically validate these bounds in our experimental system, showing that computational learning theory can account for the observations that violate Kimura’s theory.
Finally, we note that natural selection is a highly effective learning process in that selection is an asymptotically optimal algorithm for the problem faced by evolving populations, and no other algorithm can consistently outperform selection in general. Our results highlight the centrality of information to natural selection and the value of computational learning theory as a perspective on evolutionary biology. 
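The regret framing has a compact toy version (my sketch, not the paper's E. coli system): under replicator dynamics, selection is exactly the multiplicative-weights algorithm, so the gap between the best single genotype's log-growth in hindsight and the population's realized log-growth is the regret, and it is bounded by log K regardless of how long selection runs.

```python
import math
import random

# Toy sketch (my construction, not the paper's experimental system):
# K genotypes evolve by selection in a fluctuating environment. The
# replicator update is the multiplicative-weights algorithm, so the
# learning-theoretic regret -- best lineage's log-growth in hindsight
# minus the population's realized log-growth -- measures the cost of
# information acquisition (a substitution-load analogue).

random.seed(0)
K, T = 4, 500
freq = [1.0 / K] * K             # genotype frequencies
lineage_log_growth = [0.0] * K   # cumulative log-fitness of each pure lineage
pop_log_growth = 0.0             # cumulative log of population mean fitness

for _ in range(T):
    # random environment: each genotype draws a fitness in [0.5, 1.5]
    w = [random.uniform(0.5, 1.5) for _ in range(K)]
    mean_w = sum(f * wi for f, wi in zip(freq, w))
    pop_log_growth += math.log(mean_w)
    for k in range(K):
        lineage_log_growth[k] += math.log(w[k])
    # selection step = multiplicative-weights update
    freq = [f * wi / mean_w for f, wi in zip(freq, w)]

regret = max(lineage_log_growth) - pop_log_growth
print(f"total regret: {regret:.4f} (bound: log K = {math.log(K):.4f})")
```

Because the population starts uniform, the regret here is provably between 0 and log K nats for any environment sequence, which is one way to read the abstract's claim that selection is an asymptotically optimal learning algorithm.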



Cellular sentience as the primary source of biological order and evolution

All life is cellular, starting some 4 billion years ago with the emergence of the first cells. In order to survive their early evolution in the face of an extremely challenging environment, the very first cells invented cellular sentience and cognition, allowing them to make relevant decisions to survive through creative adaptations in a continuously running evolutionary narrative. We propose that the success of cellular life has crucially depended on a biological version of Maxwell's demons which permits the extraction of relevant sensory information and energy from the cellular environment, allowing cells to sustain anti-entropic actions. These sensor-effector actions allowed for the creative construction of biological order in the form of diverse organic macromolecules, including crucial polymers such as DNA, RNA, and cytoskeleton. Ordered biopolymers store analogue (structures as templates) and digital (nucleotide sequences of DNA and RNA) information that functioned as a form of memory to support the development of organisms and their evolution. Crucially, all cells are formed by the division of previous cells, and their plasma membranes are physically and informationally continuous across evolution since the beginning of cellular life. It is argued that life is supported through life-specific principles which support cellular sentience, distinguishing life from non-life. Biological order, together with cellular cognition and sentience, allow the creative evolution of all living organisms as the authentic authors of evolutionary novelty.

(closed article, can't read; looks like it's not on scihub. looks very interesting anyhow.) https://www.sciencedirect.com/science/article/abs/pii/S0303264722000818 


Semantic information, autonomous agency and non-equilibrium statistical physics

Shannon information theory provides various measures of so-called syntactic information, which reflect the amount of statistical correlation between systems. By contrast, the concept of ‘semantic information’ refers to those correlations which carry significance or ‘meaning’ for a given system. Semantic information plays an important role in many fields, including biology, cognitive science and philosophy, and there has been a long-standing interest in formulating a broadly applicable and formal theory of semantic information. In this paper, we introduce such a theory. We define semantic information as the syntactic information that a physical system has about its environment which is causally necessary for the system to maintain its own existence. ‘Causal necessity’ is defined in terms of counter-factual interventions which scramble correlations between the system and its environment, while ‘maintaining existence’ is defined in terms of the system's ability to keep itself in a low entropy state. We also use recent results in non-equilibrium statistical physics to analyse semantic information from a thermodynamic point of view. Our framework is grounded in the intrinsic dynamics of a system coupled to an environment, and is applicable to any physical system, living or otherwise. It leads to formal definitions of several concepts that have been intuitively understood to be related to semantic information, including ‘value of information’, ‘semantic content’ and ‘agency’. 
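The counterfactual-scrambling move can be illustrated with a toy simulation (setup and parameters are my own illustrative choices, not the paper's formalism): compare an agent's viability when its sensor is correlated with the environment against a counterfactual run where that correlation is scrambled; the viability drop is the portion of the correlation that is causally necessary for self-maintenance.

```python
import random

# Toy sketch of the counterfactual-scrambling idea (my own illustrative
# setup): an agent's sensor S is a noisy copy of an environment state E,
# and the agent persists in a step when its action (a copy of S) matches
# E. Scrambling the E-S correlation and measuring the drop in viability
# isolates the semantic (viability-relevant) part of the information.

random.seed(1)
N = 100_000   # trials
NOISE = 0.1   # sensor error rate (assumed)

def viability(scramble: bool) -> float:
    survived = 0
    for _ in range(N):
        e = random.randint(0, 1)                      # environment state
        s = e if random.random() > NOISE else 1 - e   # noisy sensor reading
        if scramble:
            s = random.randint(0, 1)  # intervention: destroy E-S correlation
        survived += (s == e)          # agent survives iff action matches E
    return survived / N

v_intact = viability(scramble=False)
v_scrambled = viability(scramble=True)
print(f"viability intact:    {v_intact:.3f}")
print(f"viability scrambled: {v_scrambled:.3f}")
print(f"value of information: {v_intact - v_scrambled:.3f}")
```

In the paper's terms, the gap between the intact and scrambled viabilities plays the role of the 'value of information' for this system.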


Subjective Information and Survival in a Simulated Biological System

Information transmission and storage have gained traction as unifying concepts to characterize biological systems and their chances of survival and evolution at multiple scales. Despite the potential for an information-based mathematical framework to offer new insights into life processes and ways to interact with and control them, the main legacy is that of Shannon’s, where a purely syntactic characterization of information scores systems on the basis of their maximum information efficiency. The latter metrics seem not entirely suitable for biological systems, where transmission and storage of different pieces of information (carrying different semantics) can result in different chances of survival. Based on an abstract mathematical model able to capture the parameters and behaviors of a population of single-celled organisms whose survival is correlated to information retrieval from the environment, this paper explores the aforementioned disconnect between classical information theory and biology. In this paper, we present a model, specified as a computational state machine, which is then utilized in a simulation framework constructed specifically to reveal emergence of a “subjective information”, i.e., trade-off between a living system’s capability to maximize the acquisition of information from the environment, and the maximization of its growth and survival over time. Simulations clearly show that a strategy that maximizes information efficiency results in a lower growth rate with respect to the strategy that gains less information but contains a higher meaning for survival. 
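The trade-off the abstract describes can be caricatured in a few lines (my own construction, not the paper's state-machine model): if acquiring bits costs energy and only some bits carry survival-relevant meaning, the strategy that maximizes information acquired grows more slowly than the one that reads only what matters.

```python
# Toy illustration of the "subjective information" trade-off (my own
# construction, not the paper's computational state machine): an
# environment broadcasts BITS bits per step, but only one bit is relevant
# to survival. Reading bits costs energy, so maximizing information
# efficiency loses to maximizing meaning per bit.

BITS = 8
BENEFIT = 1.0       # growth benefit of acting on the relevant bit (assumed)
COST_PER_BIT = 0.1  # energy cost of acquiring one bit (assumed)

def growth_rate(bits_read: int, reads_relevant_bit: bool) -> float:
    info_benefit = BENEFIT if reads_relevant_bit else 0.0
    return info_benefit - COST_PER_BIT * bits_read

max_info_strategy = growth_rate(BITS, True)  # maximize information acquired
subjective_strategy = growth_rate(1, True)   # read only the meaningful bit
print(f"max-information growth: {max_info_strategy:.2f}")
print(f"subjective growth:      {subjective_strategy:.2f}")
```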



more misc results

From https://metaphor.systems/search?q=mutual%20information%20%2C%20and%20its%20information%20theory%20relationship%20to%20friendliness%20between%20beings%20in%20an%20environment mildly filtered:

  • A Thermodynamic Approach towards the Question "What is Cellular Life?"
    • The question "What is life?" has been asked and studied by the researchers of various fields. Nevertheless, no global theory which unified various aspects of life has been proposed so far. Considering that the physical principle for the theory of birth should be the one known for the unanimated world, and that the life processes are irreversibly selective, we showed by a deductive inference that the maximum entropy production principle plays an essential role for the birth and the evolution of life in a fertile environment. In order to explain the survival strategy of life in a barren period of environment, we also proposed that life had simultaneously developed a reversible on and off switching mechanism of the chemical reactions by the dynamics of equilibrium thermodynamics. Thus, the birth and evolution of life have been achieved by the cooperation between the driving force due to the non-equilibrium thermodynamics and the protective force due to the equilibrium thermodynamics in the alternating environmental conditions. 
  • Hybrid Life: Integrating Biological, Artificial, and Cognitive Systems
    • Artificial life is a research field studying what processes and properties define life, based on a multidisciplinary approach spanning the physical, natural and computational sciences. Artificial life aims to foster a comprehensive study of life beyond “life as we know it” and towards “life as it could be”, with theoretical, synthetic and empirical models of the fundamental properties of living systems. While still a relatively young field, artificial life has flourished as an environment for researchers with different backgrounds, welcoming ideas and contributions from a wide range of subjects. Hybrid Life is an attempt to bring attention to some of the most recent developments within the artificial life community, rooted in more traditional artificial life studies but looking at new challenges emerging from interactions with other fields. In particular, Hybrid Life focuses on three complementary themes: 1) theories of systems and agents, 2) hybrid augmentation, with augmented architectures combining living and artificial systems, and 3) hybrid interactions among artificial and biological systems. After discussing some of the major sources of inspiration for these themes, we will focus on an overview of the works that appeared in Hybrid Life special sessions, hosted by the annual Artificial Life Conference between 2018 and 2022. 
  • How Organisms Gained Causal Independence and How It Might Be Quantified
    • Two broad features are jointly necessary for autonomous agency: organisational closure and the embodiment of an objective-function providing a ‘goal’: so far only organisms demonstrate both. Organisational closure has been studied (mostly in abstract), especially as cell autopoiesis and the cybernetic principles of autonomy, but the role of an internalised ‘goal’ and how it is instantiated by cell signalling and the functioning of nervous systems has received less attention. Here I add some biological ‘flesh’ to the cybernetic theory and trace the evolutionary development of step-changes in autonomy: (1) homeostasis of organisationally closed systems; (2) perception-action systems; (3) action selection systems; (4) cognitive systems; (5) memory supporting a self-model able to anticipate and evaluate actions and consequences. Each stage is characterised by the number of nested goal-directed control-loops embodied by the organism, summarised as will-nestedness N. Organism tegument, receptor/transducer system, mechanisms of cellular and whole-organism re-programming and organisational integration, all contribute to causal independence. Conclusion: organisms are cybernetic phenomena whose identity is created by the information structure of the highest level of causal closure (maximum N), which has increased through evolution, leading to increased causal independence, which might be quantifiable by ‘Integrated Information Theory’ measures. 
  • Biological evolution as defense of 'self'.
  • On the Scales of Selves: Information, Life, and Buddhist Philosophy
    • When we attempt to define life, we tend to refer to individuals, those that are alive. But these individuals might be cells, organisms, colonies... ecosystems? We can describe living systems at different scales. Which ones might be the best ones to describe different selves? I explore this question using concepts from information theory, ALife, and Buddhist philosophy. After brief introductions, I review the implications of changing the scale of observation, and how this affects our understanding of selves at different structural, temporal, and informational scales. The conclusion is that there is no ``best'' scale for a self, as this will depend on the scale at which decisions must be made. Different decisions, different scales. 
  • Entropic boundary conditions towards safe artificial superintelligence
    • Artificial superintelligent (ASI) agents that will not cause harm to humans or other organisms are central to mitigating a growing contemporary global safety concern as artificial intelligent agents become more sophisticated. We argue that it is not necessary to resort to implementing an explicit theory of ethics, and that doing so may entail intractable difficulties and unacceptable risks. We attempt to provide some insight into the matter by defining a minimal set of boundary conditions potentially capable of decreasing the probability of conflict with synthetic intellects intended to prevent aggression towards organisms. Our argument departs from causal entropic forces as good general predictors of future action in ASI agents. We reason that maximising future freedom of action implies reducing the amount of repeated computation needed to find good solutions to a large number of problems, for which living systems are good exemplars: a safe ASI should find living organisms intrinsically valuable. We describe empirically-bounded ASI agents whose actions are constrained by the character of physical laws and their own evolutionary history as emerging from H. sapiens, conceptually and memetically, if not genetically. Plausible consequences and practical concerns for experimentation are characterised, and implications for life in the universe are discussed. 
  • Provenance of life: Chemical autonomous agents surviving through associative learning.
    • We present a benchmark study of autonomous, chemical agents exhibiting associative learning of an environmental feature. Associative learning systems have been widely studied in cognitive science and artificial intelligence but are most commonly implemented in highly complex or carefully engineered systems, such as animal brains, artificial neural networks, DNA computing systems, and gene regulatory networks, among others. The ability to encode environmental information and use it to make simple predictions is a benchmark of biological resilience and underpins a plethora of adaptive responses in the living hierarchy, spanning prey animal species anticipating the arrival of predators to epigenetic systems in microorganisms learning environmental correlations. Given the ubiquitous and essential presence of learning behaviors in the biosphere, we aimed to explore whether simple, nonliving dissipative structures could also exhibit associative learning. Inspired by previous modeling of associative learning in chemical networks, we simulated simple systems composed of long- and short-term memory chemical species that could encode the presence or absence of temporal correlations between two external species. The ability to learn this association was implemented in Gray-Scott reaction-diffusion spots, emergent chemical patterns that exhibit self-replication and homeostasis. With the novel ability of associative learning, we demonstrate that simple chemical patterns can exhibit a broad repertoire of lifelike behavior, paving the way for in vitro studies of autonomous chemical learning systems, with potential relevance to artificial life, origins of life, and systems chemistry. The experimental realization of these learning behaviors in protocell or coacervate systems could advance a new research direction in astrobiology, since our system significantly reduces the lower bound on the required complexity for autonomous chemical learning. 
  • Darwin : Information and entropy drive the evolution of life
The evolution of species, according to Darwin, is driven by struggle – by competition between variant autonomous individuals for survival and reproductive advantage; the outcome of this struggle for survival is natural selection of the fittest. The Neo-Darwinians reframed natural selection in terms of DNA: inherited genotypes directly encode expressed phenotypes; a fit phenotype means a fit genotype – thus the evolution of species is the evolution of selfish, reproducing individual genotypes. Four general characteristics of advanced forms of life are not easily explained by this Neo-Darwinian paradigm: 1) Dependence on cooperation rather than on struggle, manifested by the microbiome, ecosystems and altruism; 2) The pursuit of diversity rather than optimal fitness, manifested by sexual reproduction; 3) Life’s investment in programmed death, rather than in open-ended survival; and 4) The acceleration of complexity, despite its intrinsic fragility. Here I discuss two mechanisms that can resolve these paradoxical features; both mechanisms arise from viewing life as the evolution of information. Information has two inevitable outcomes; it increases by autocatalysis and it is destroyed by entropy. On the one hand, the autocatalysis of information inexorably drives the evolution of complexity, irrespective of its fragility. On the other hand, only those strategic arrangements that accommodate the destructive forces of entropy survive – cooperation, diversification, and programmed death result from the entropic selection of evolving species. Physical principles of information and entropy thus fashion the evolution of life.
  • https://www.frontiersin.org/articles/10.3389/fevo.2019.00219/full 
  • https://arxiv.org/abs/1808.06723?context=math 
  • https://arxiv.org/abs/1307.4325 
  • https://informationtransfereconomics.blogspot.com/2017/04/a-tour-of-information-equilibrium.html 
  • https://arxiv.org/abs/1510.04455 
  • https://arxiv.org/abs/1510.05941 
  • https://studio.ribbonfarm.com/p/mutualism 
  • https://arxiv.org/abs/1409.4708 
  • https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7244620/ 
  • https://arxiv.org/abs/1406.5688v3 
  • https://necsi.edu/mutualistic-relationships 
  • https://www.mdpi.com/1099-4300/21/10/949 
  • https://math.ucr.edu/home/baez/information/ 
  • http://www.whatlifeis.info/pages/News.html 
  • https://math.ucr.edu/home/baez/bio_asu/ 
  • https://royalsocietypublishing.org/doi/10.1098/rsfs.2018.0041 
  • https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4384894/ 
  • http://psychsciencenotes.blogspot.com/2017/02/the-nature-of-ecological-perceptual.html 
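The long-/short-term-memory motif in the "Provenance of life" abstract above can be sketched with toy kinetics (my own ODE caricature, not the paper's Gray-Scott spots): a fast-decaying species tracks one stimulus, and a slow species grows only on coincidence with the second stimulus, so it ends up encoding the temporal correlation.

```python
# Toy kinetics for the associative-learning motif (my own caricature, not
# the paper's Gray-Scott reaction-diffusion model): short-term species m_s
# tracks stimulus A and decays fast; long-term species m_l grows only when
# stimulus B arrives while m_s is still elevated, so m_l encodes the A->B
# temporal correlation.

def simulate(paired: bool, steps: int = 200, dt: float = 0.1) -> float:
    m_s = m_l = 0.0
    lag = 5 if paired else 30   # B arrives 5 or 30 steps after each A pulse
    for t in range(steps):
        a = 1.0 if t % 50 == 0 else 0.0     # stimulus A pulses periodically
        b = 1.0 if t % 50 == lag else 0.0   # stimulus B follows at the lag
        m_s += dt * (5.0 * a - 1.0 * m_s)         # short-term: fast decay
        m_l += dt * (2.0 * m_s * b - 0.01 * m_l)  # slow coincidence detector
    return m_l

learned = simulate(paired=True)    # B follows A while m_s is still high
control = simulate(paired=False)   # B arrives after m_s has decayed
print(f"paired long-term memory:   {learned:.3f}")
print(f"unpaired long-term memory: {control:.3f}")
```

The paired condition accumulates far more long-term memory than the unpaired one, which is the benchmark behavior the paper implements in chemical patterns.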

