David Lynn Abel
ProtoBioCybernetics/Protocellular Metabolomics, The Gene Emergence Project, The Origin of Life Science Foundation, Inc.
*Corresponding author: David Lynn Abel, ProtoBioCybernetics/Protocellular Metabolomics, The Gene Emergence Project, The Origin of Life Science Foundation, Inc.
Received: 16 September 2024; Accepted: 20 September 2024; Published: 07 October 2024
Life is programmed, logic-gate-controlled, and cybernetically processed. Life is algorithmic with sequentially completed (“halting”) operations. Homeostatic metabolism is mediated by highly integrated circuits of configurable switch-settings, both genetic and epigenetic. Prescriptive Information (PI) controls, not just constrains, life’s exquisitely functional processes. Life’s instructions are conceptual, not just complex. Life is Computation. Naturalistic science has been pursuing abiogenesis and life’s definition on a purely physicodynamic basis for many decades with great frustration. Increasingly, investigators have been more willing to acknowledge prominent elements of life’s formal orchestration. Physicalism fails to explain the reality of “systems biology,” bona fide formal organization (as opposed to the merely redundant self-ordering of fractal, chaos and complexity theories), ingenious semiosis using representational physical symbol vehicles in material symbol systems, various abstract codes, superimposed multidimensional coding in the same string of symbols, linguistic-like rules rather than laws, and controls (rather than mere constraints). None of these formal aspects of sub-cellular metabolism are reducible to mere Shannon Uncertainty measures or irreversible nonequilibrium thermodynamic “possibilities.” “Assembly Theory” fails miserably to explain, let alone measure, the difficulty of orchestrating homeostatic metabolism. Life’s processes often seem to be indistinguishable from artificial computation by digital devices. Could it be that computation is the essence of life’s elusive definition? Life is not a thermodynamic state. Life demonstrates persistently programmed computational “processes.” Life is programmed and conducted by pragmatic cybernetic “operations.” Life consists of Sustained Functional Systems (SFS) that transduce useless wasted energy into usable energy, carrying life uniquely far from equilibrium. 
No function known to modern engineering measures up to the sophistication of sub-cellular nanocomputer and molecular machine biofunction. Is life’s computation merely physicodynamic, or is it every bit as abstract, conceptual, nonphysical and formal as the mathematical laws that define and govern physicality?
Life Definition; Protocells; Abiogenesis; Life Origin; Molecular Evolution; Chemical Evolution; Pre-Darwinian Evolution; Computational Biology; Self-organization; Emergence
First, we must investigate the hypothesis that life is fundamentally formal computation rather than mere physicodynamic interaction. We live in a mass/energy physical world. We also live in an abstract, nonphysical, conceptual, formal world. Hamming [1] and Wigner [2] both pondered “the unreasonable effectiveness of mathematics in the physical sciences.” We wish to keep naturalistic science uncompromised. But we also use formal choice-based engineering, mathematics, logic theory, linguistics and computation to accomplish almost every naturalistic scientific endeavor. We cannot practice the scientific method without assimilating formalisms into our concept of reality. The highly desired “Theory of Everything” would be mathematical, not physical. Thus, is the effectiveness of mathematics in the physical sciences really “unreasonable?” Perhaps what is unreasonable is demanding that metaphysical physicalism (naturalism) be the starting and limiting axiom of scientific investigation.
Life has always escaped definition.
Around the turn of the millennium, I spoke at a worldwide conference in Italy of life-origin scientists who sought to define life [3]. A written definition of life was required of every scientist participating in that conference. No two definitions were the same! No satisfactory definition of life has been published since, either. Reasonable descriptions of life have been published [4-8]. Science’s reductionism to a single-celled organism helps. We theorize protocell toy models [9-25]. NASA’s emphasis is usually placed on the ability of a cell to grow, reproduce, pursue functional activity, and evolve. Not so clear is how any protocell acquired any of those traits. The elucidation of abiogenesis and the study of protocellular metabolomics offer the best hope of understanding most fundamentally what life is, and exactly how life is distinguished from inanimate physicodynamics.
Perhaps the reason abiogenesis presents such an elucidation problem for science is that philosophic naturalism tends to be unwilling to experiment with anything other than physicodynamic interactions alone. At the same time, biology is constantly confronted with the reality of conceptual and goal-oriented “biochemical pathways” [26-30], highly integrated biological circuits [31-36], exquisitely orchestrated “homeostatic metabolism far from equilibrium” [37-39], “programming” [40-45], Szostak’s “functional information” [46-51], Abel’s more refined “Prescriptive Information” (PI) [52-54], “computational biology” [55-60], “systems biology” [61-66], abstract representational “symbolization with codes” [67-74], superimposed, multidimensional codes [75], chaperone control of protein folding into the needed tertiary structures [76-79], “biosemiosis” [46,73,80-87], “transcription” [88-92] and “translation” [93-96]. A virus doesn’t process anything. A virus performs no operations. A virus is not a system. It pursues no goals. A virus is just an inanimate database. A virus is not alive. Its host is alive. The host incorporates a virus’s Prescriptive Information (PI) [43,52-54,97] into its own genome.
Whatever life is, its programming, processing with logical operations, and subcellular systems seem to be life’s essence. That is worth repeating:
Life is most fundamentally controlled (not merely constrained!) by cybernetic processes. Life involves what artificial cybernetics calls “operations” purposely directed toward computational success (halting). Life is characterized by integrated circuits and formally orchestrated highly cooperative formal biosystems. Life pursues the undeniable goal of being alive and staying alive. Something is very wrong with limiting biology’s attempts to elucidate life’s mechanisms to nothing more than inanimate physicodynamics. Inanimacy is simply not capable of accomplishing the wish-fulfillments of “self-organization” or “emergence” [98,99]. The latter two concepts are nothing more than pre-biotic pipe dreams. There is absolutely nothing scientific about either. Prigogine’s mere self-ordering was distinguished from formal organization years ago in life-origin science [99,100].
If we are ever able to define life, that definition will be derived from programming, cybernetic processing, logic gate operations, well-orchestrated systems, and continual fully-realized homeostatic metabolic integration and goals far from equilibrium in Sustained Functional Systems (SFS) [101]. Such systems invariably require Maxwell’s demon’s Choice Causation of when to open and close the trap door between thermodynamic compartments. If we disallow active selection of when the demon opens and closes his trap door, not even the simplest conceivable heat engine can be generated by thermodynamics [101], complexity [102-106], chaos [103,107-109] or “Assembly Theory” [110-115].
But now we have a problem. The demon’s trap door must be controlled, not merely constrained. All of the necessary causes of orchestration and bona fide organization are formal, not physical. The choice of when to open or close the demon’s trap door between thermodynamic compartments must be an active selection, not a passive selection [116]. The openings and closings of the trap door are not caused by any later-realized superior functional fitness as with environmental selection. Maxwell’s demon’s choices cause the potential function. They are as abstract and conceptual as mathematics itself. Such active selections cannot be reduced to mere physical complexity [117]. The self-ordering of chaos theory cannot explain a single one of these phenomena. Irreversible nonequilibrium thermodynamics cannot orchestrate formal organization. Assembly Theory cannot actively select for “useful” moieties over non-useful moieties. An inanimate environment shows no preference for function over non-function. Physicodynamics has no goals. The required active selection simply transcends irreversible nonequilibrium thermodynamics, chaos theory, molecular evolution and Assembly Theory’s capabilities.
But all the molecules and energy of life are physical, aren’t they? It could be argued that this manuscript is physical. But in reality, the concepts herein presented are purely abstract, conceptual, linguistic, mathematical, hopefully logical, nonphysical and formal. This paper’s formalisms are just instantiated into physicality using digital recordation and retention methods, along with use of other physical media like molecules of ink and paper. Yet this manuscript simply cannot be reduced to physicality. Neither can the scientific method or reality in general.
What exactly is computation?
We tend to think of computation as an act of mathematical calculation. But as Babbage’s co-worker, the mathematician Ada Lovelace, first pointed out in 1843, computation does not have to be arithmetic. It does not actually have to be a “calculation,” either. Computation’s broader meaning is simply “information processing” [118]. Here we would first have to differentiate between Shannon Uncertainty (misnamed “information”) and real information. Real information is Szostak’s “functional information” [46-51] and Abel’s more refined “Prescriptive Information” [43,52-54,119]. The human mind can compute, but physically instantiated cybernetic technology can compute ever so much faster, more reliably and more efficiently. Initial “technology” may have been the abacus. But the abacus required direct human involvement in each step of computation.
In one of the most profound engineering feats of all time, Jacques de Vaucanson in 1745 invented a mechanical device in the form of a semi-automated loom. The machine read punched paper tape, advancing down to the next row at each step. The loom was still manually controlled, however. Joseph-Marie Jacquard’s card-punched looming machine in 1804 was probably the first mechanical automated “machine” capable of executing commands and producing functional algorithmic success.
“Punch cards” were creatively used to instantiate “active selections” (purposeful choices) into a physical medium of choice retention and readable commands used to prescribe automated desired utility. The cause was choice contingency. The effects were “useful results” as judged by agents. The punched holes prescribed which thread should be used in what position in the fabric. De Vaucanson and Jacquard had found a way to instantiate abstract concepts of the mind into physicality. Of greatest interest were ideas that issued instructions and commands that could direct physical operational functions. The location of the holes was used as a modified sign or symbol to represent the idea’s meaning, purpose and commands.
Computation is not caused by indifferent constraints or invariant physical law. The holes in certain places of the physical card medium represented active selections from among real physical options. They provided not constraints on physicality, but bona fide formal actively selected controls of physicality. The reason these inventions were so significant is that The Cybernetic Cut [120,121] had been traversed by technology across an infinitely deep ravine from its far formal side to its near physicodynamic side. Abstract, nonphysical, actively selected concepts had caused effects in the physical world using a machine. Choice Causation had been successfully automated into mass/energy effects. The holes in the physical cards instantiated the desires and purposeful choices of the weaver agent. The formalism of conceptual, abstract Choice Causation (as opposed to chance and necessity) was transcribed and translated into a physical medium [43,122,123]. The cards were physical; the machine was physical. But the controls that produced the sophisticated physical product were purely formal.
How could this infinitely deep Cybernetic Cut ravine possibly have been traversed? It was traversed via the very narrow one-way bridge called the “Configurable Switch Bridge” [120,121,124]. This bridge is constructed through the design and engineering of physical configurable switch-settings capable of recording, storing and executing voluntary conceptual formal commands. The commands consist of decision-node choices in pursuit of functionality. That functionality is desired by agent programmers and algorithmic optimizers. Certain configurable switch-settings were made using active selections from among real physical options. Circuits can only be integrated by formal Choice Causation, not by irreversible nonequilibrium thermodynamic “possibility space.” The light switches on our walls provide a simple example. The configurable switches themselves are physical. But their causal operation and control is altogether abstract, conceptual, nonphysical and formal. The force of gravity doesn’t turn the light off. Only active selection of the desired configurable switch-setting turns the light off. The light is operated by Choice Causation, not by physicodynamic causation. Configurable switches are designed and engineered so that none of the four known forces of physics can select the switch-setting. Only one cause turns the light off: active selection, purposeful choice, not the force of gravity.
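The light-switch illustration lends itself to a minimal code sketch. The toy class below is purely illustrative (the class and method names are mine, not established terminology): the switch object stands in for the physical device, but nothing in the simulated "physics" of the program alters its setting; only an explicit selection between the two real options does.

```python
# Toy model (illustrative only, not from the cited literature): a
# configurable switch whose setting changes only by explicit choice.

class ConfigurableSwitch:
    """A two-state switch; its current setting is a recorded choice."""

    OPTIONS = ("on", "off")  # the real physical options available

    def __init__(self):
        self.setting = "off"  # arbitrary default until an agent chooses

    def choose(self, setting):
        # Only an explicit selection from among the real options
        # changes the state; no simulated force can call this.
        if setting not in self.OPTIONS:
            raise ValueError("not a real option")
        self.setting = setting

light = ConfigurableSwitch()
light.choose("on")
assert light.setting == "on"
light.choose("off")   # the choice, not gravity, turns the light off
assert light.setting == "off"
```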
Improving card-punching methodologies further advanced computational technology. Babbage’s later Analytical Engine expanded the same basic ideas into “autonomous operations.” Eventually, mechanical machines were replaced with electronic devices that came to be called “computers.” Cards with holes punched in certain locations were replaced with electromagnetic tapes. These tapes in turn were replaced with stamped integrated circuits and many other later technologies. All of these approaches involved the instantiation into physicality of purely formal purposeful choices made ultimately by agents doing the programming of physical machines and their operations/processes. Eventually, it was realized in the 20th century that programming did not have to be sequential. “Conditional branching” would eventually allow control transfer to a different instruction stream depending on the value of specified data.
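Conditional branching can be shown in a few hypothetical lines: control transfers to one of two different instruction streams depending on the value of the data inspected at the decision node.

```python
# Minimal illustration (hypothetical program, not from the source):
# control transfers to a different instruction stream depending on data.

def branch(value):
    if value >= 0:          # the branch condition inspects the data
        return "stream A"   # one instruction stream
    return "stream B"       # the alternative instruction stream

assert branch(3) == "stream A"
assert branch(-1) == "stream B"
```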
What are Operations?
Another term of interest in Babbage’s history was “Operations.” What do we mean by “operations”? Operations perform some task, procedure or process that is useful according to some agent’s definition of “useful.” The agent desires the completion of some task or procedure for some reason of perceived value: some desired final function. Remove the inventor agent from the equation, and no analytical machine would have ever “self-organized” or “emerged” in any amount of eons of time in any multiverse. All that would have occurred would have been sequences of undirected physical interactions. This is an absolute logic-theory prohibition, not a relative best-thus-far inductive prohibition. The Church/Turing thesis declares that computable “functions” are accomplished only by “effective procedures” (purposeful operations that achieve some goal) that can be conducted by a Turing “machine.”
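The Church/Turing notion of an “effective procedure” conducted by a Turing “machine” can be sketched minimally. The interpreter and rule table below are a standard textbook-style toy of my own construction, not drawn from the cited literature: a finite table of prescribed choices drives the machine step by step to a halting state.

```python
# Minimal Turing-style machine (illustrative sketch): an "effective
# procedure" is a finite rule table driven to a halting state.

def run(tape, rules, state="start", pos=0, max_steps=100):
    tape = dict(enumerate(tape))  # sparse tape; "_" is the blank symbol
    for _ in range(max_steps):
        if state == "halt":
            return state, tape
        symbol = tape.get(pos, "_")
        write, move, state = rules[(state, symbol)]  # prescribed choice
        tape[pos] = write
        pos += move
    return state, tape  # did not halt within the step budget

# A hypothetical rule table that flips every bit, then halts at the blank:
rules = {
    ("start", "0"): ("1", 1, "start"),
    ("start", "1"): ("0", 1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

state, tape = run(["1", "0", "1"], rules)
assert state == "halt"
assert [tape[i] for i in range(3)] == ["0", "1", "0"]
```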
To explore this thesis, we must first ask, “What is a ‘function’?” A function is normally the product of a formally directed process, not a physicodynamic or thermodynamic state. Second, what is a “procedure,” and what is meant by “effective?” A procedure that is “effective” (efficacious) is typically a processed step-wise algorithm. To be most effective (produce the “fittest” product), algorithms must be optimized. All of these concepts are formal, not physical.
Finally, what is a “machine?” A machine, including subcellular nanocomputers and molecular machines, is a device capable of processing those efficaciously programmed procedures, certainly nothing merely physicodynamic. Such machines have to be designed and engineered into existence. Like cell phones, they don’t just happen. Electronic configurable switches instantiate logic gate choices that permit cybernetic “operations.” Physical methyl groups mediate epigenetic configurable switches. Sequential specific nucleoside active selections program sequential codon and translational pausing instructions. Although physical, their effects all depend upon formal Choice Causation of which nucleoside to polymerize next in the string, not thermodynamic possibility, chaos or mere complexity.
What is Cybernetics?
Cybernetics refers most fundamentally to the communication of “steering” and “control.” Automated control systems in both machines and in living things are addressed by Cybernetics. Cybernetic control includes circular processes such as feedback mechanisms where outputs can serve as inputs. But mere recursiveness does not necessarily establish any useful control. Presupposed in any means of bona fide control is the motive to achieve utility of some sort. Vast technological improvements have not altered the basic modus operandi of automated cybernetics. A cybernetic logic gate is the equivalent of a physical configurable switch. The machines needed to process programming have to be designed and engineered into existence. The few “simple machines” that might be mentioned as exceptions are only machines when agents use them as machines. For example, an inclined plane is not a machine when the wind blows a tumbleweed up a hill. An inclined plane is only a simple machine if an agent uses the inclined plane to accomplish some useful goal (pushing a heavy crate up some steps). Following publication of Kurt Gödel’s famous incompleteness theorems in 1931, many of the refiners of modern cybernetics got their ideas from the progressive discovery of the cybernetics of subcellular digital controls that were just beginning to be elucidated more fully in the 1950s [125-133]. The parallels they perceived between real subcellular cybernetics and potential artificial cybernetics were profound. Thus, it might behoove naturalistic abiogenists to address the question, “How did computer-like computation get started at the sub-cellular level in a purely physicodynamic prebiotic world?”
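The stated equivalence of a cybernetic logic gate and a physical configurable switch can be caricatured in code. The sketch below (illustrative only) composes the familiar Boolean gates out of one primitive, NAND; which gate a given configuration implements is fixed entirely by how the primitive is wired, that is, by prior choices, not by any physical law of the program.

```python
# Sketch (illustrative only): Boolean logic gates composed from a single
# primitive. The wiring choices, not physics, determine each gate's logic.

def NAND(a, b):
    return not (a and b)

def NOT(a):
    return NAND(a, a)          # one wiring choice of the same primitive

def AND(a, b):
    return NOT(NAND(a, b))     # a different wiring choice

def OR(a, b):
    return NAND(NOT(a), NOT(b))  # yet another wiring choice

assert AND(True, False) is False
assert OR(True, False) is True
assert NOT(False) is True
```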
The capabilities of physicodynamics
Countless interactions and reactions occur spontaneously in nature. These interactions can self-order into dissipative structures a la chaos theory [134-139]. But they have never been observed to formally self-organize into formally controlled processes or systems in pursuit of sophisticated functions. Undirected physicodynamics is blind to utility. Programming or engineering is required to generate an “effective functional procedure,” otherwise known as an algorithm. Thus, physicodynamics can self-order, but physicodynamics cannot orchestrate or organize anything into existence, let alone itself. Self-organization is a nonsense naturalistic term. Neither chance nor necessity can organize anything.
For constraints to become controls requires an agent to actively select those constraints, usually by orchestrating initial conditions. It’s called “investigator involvement” in experimental design. This flaw results in the illusion of physicodynamic events having formal capabilities. Typically, the ultimate source of steering and control is what philosophers of science call “agency.” Agency refers to the ability to choose from among real options within and despite the context of physical law constraints. Purposeful operations are undertaken only by agents. Instrumentality is engineered to optimize efficiency of tasks. Non-trivial algorithmic optimization has never been observed to arise out of anything other than Choice Causation attributable to agents. We dichotomize “natural process” from engineering for good reason. Virtually every sophisticated entity known to science arose from engineering, not “natural process.” We see self-ordering in nature all the time in the form of dissipative structures. But we have never observed a simple piece of wire with constant diameter and the needed malleability and tensile strength spontaneously emerge from unaided, uncontrolled physicodynamics (See Figure 1).
Figure 1: An old, oxidized piece of wire. What empirical evidence can we cite of this simple long piece of metal alloy with constant diameter ever having spontaneously “self-organized” or “emerged” from iron ore in the ground? How did this simple piece of wire come into existence with just the right tensile strength and malleability to make even a crude paper clip?
Figure 1 offers the scientific reductionism needed to appreciate the silliness of believing in the spontaneous generation of life. If a simple piece of wire has never “self-organized” or “emerged” from the forest floor by physicodynamic interactions alone, what would be the rational or empirical basis for any competent scientist seriously believing in the spontaneous generation of life?
Algorithms are crafted to solve problems.
An algorithm is an operation consisting of a finite sequence of instructions or commands undertaken to solve a class of problems. Note first that even the statement of a “problem” that needs solving is formal, not physical. Algorithms are instructions of well-defined step-wise processes made in pursuit of functional goals. The goal of solving any problem is formal, not physical.
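Euclid’s greatest-common-divisor procedure is the classic worked example of this definition: a finite sequence of well-defined, prescribed steps that solves an entire class of problems (any pair of positive integers) and is guaranteed to halt.

```python
# Classic example (Euclid's algorithm): a finite sequence of well-defined
# steps undertaken to solve a whole class of problems, not one instance.

def gcd(a, b):
    while b != 0:        # each step is prescribed in advance
        a, b = b, a % b  # the remainder strictly shrinks, forcing halting
    return a             # halting: this problem instance is solved

assert gcd(48, 18) == 6
assert gcd(17, 5) == 1
```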
The formal terms “operation,” “process,” “organization,” and “system” are often bastardized into illegitimate use as physicodynamic terms. Examples include naturalists’ use of the term “natural process” and meteorologists’ use of the formal term “system” to refer to a mere weather front. “Natural process” is actually nothing more than an un-steered sequence of cause-and-effect physicodynamic interactions. Physical law causations of effects such as Prigogine’s self-ordered states (dissipative structures like tornadoes) are not formal “processes.” They involve no controls, only fixed laws and constraints [140]. No steering or purposefully controlled algorithmic procedures are involved. Weather fronts, tornadoes and hurricanes have nothing to do with “systems” or “processes” in pursuit of utility. They are merely a rapid sequence of inanimate cause-and-effect self-ordered physicodynamic states. Dissipative structures of chaos theory only destroy algorithmically processed organization, systems and computational feats. They never create orchestration or formal organization of any kind.
Today, algorithmic operations are usually programmed to run autonomously on computers. Such well-defined step-wise processes are governed by rules, not laws. Formalisms are choice-based with intent to achieve a “desired” result. They cannot be generated by “chance and necessity.” Rules address contingency not eliminated by the laws of motion. Rules are formally generated to strongly recommend voluntary behavior. Rules outline what choices would be wise and efficacious in the accomplishment of some goal. Rules are purely formal because they guide effectual active selections at true decision nodes toward utility. Rules, unlike physical laws, can be broken. Rules can be misunderstood and knowingly or unknowingly disobeyed. The typical result of failing to obey rules is loss of final function for which the rules were written. The breaking of rules does not break any laws.
Shannon Uncertainty and Possibility measures
Shannon theory can quantify the number of physical options. It cannot quantify effectual choices. Prior to specific selection of a real option, a binary decision node appears as a mere fork in the road. Forks-in-the-road can be altogether physical. The question of which fork should be taken to optimize one’s journey is altogether formal. The choice is an active selection, not a passive or secondary selection as we might see with natural selection. To program computational function, active selections must be made at bona fide decision nodes according to rules set prior to the realization of beneficial results. The choices are prescribed prior to the completed execution of the computational operation. There’s no way of knowing for sure whether the computation will even finish. This is what is known as the Church/Turing Halting problem [141,142]. Halting refers to the program successfully computing the task for which the programming choices were made, and the computer halting its operation. Computation is altogether choice-based at bona fide “decision nodes,” not mere “bifurcation points.” “Bifurcation points” (forks-in-the-road) are not synonymous with “decision nodes.” Forks in the road correspond to physical configurable switches that have not yet been set by choice contingency. Once a setting is chosen, only then does the fork-in-the-road become a functional “decision node” and configurable switch setting. Decision node choices comprise 90% of what makes reality really interesting. Algorithms can only be optimized by improving the quality of choices which constitute the instructions. Can chance and necessity make or refine programming choices? Can mutations selectively generate or optimize algorithms? Can Prescriptive Information be programmed and improved by physicodynamics? Philosophic naturalism knows full well the answers to these questions, but continues to disingenuously obfuscate.
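The opening claim of this section is easy to make concrete. Shannon’s uncertainty measure, H = -Σ p·log2(p), quantifies how many (probability-weighted) options exist at a node; it assigns exactly the same value whether or not any of the options is functional. The sketch below is a standard textbook calculation, not anything specific to the works cited here.

```python
import math

# Shannon uncertainty H = -sum p*log2(p): it counts how many equally
# likely options exist at a node; it says nothing about which option,
# if any, would be functional or useful.

def shannon_uncertainty(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A binary fork in the road: two equiprobable options carry 1 bit of
# uncertainty, whether or not either branch leads anywhere useful.
assert shannon_uncertainty([0.5, 0.5]) == 1.0

# Four equiprobable nucleotides at one sequence position: 2 bits,
# identical for a functional codon and for a meaningless one.
assert shannon_uncertainty([0.25] * 4) == 2.0
```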
What is Formalism?
Dictionary definitions of “formalism” relate to adherence to “prescribed forms” of voluntary behavior. In natural science and philosophy, these prescribed forms are usually defined mathematically, logically and linguistically. The bottom line of formalisms is Concept in pursuit of Functionality, Steering and Control.
Note that “mathematical,” “logical,” “linguistic,” “prescribed,” “controlled,” “operations,” “procedures,” “processes,” “abstractions,” “calculations,” “computations,” “function,” “utility,” “usefulness,” “behavior,” “orchestration,” “organization,” “classes” and “forms” are all nonphysical, abstract concepts of mentation. Formalisms, including the manipulation of mathematical equations, invariably involve choice-contingent causation, not physicodynamic causation. Formalisms all arise from the far side of the Cybernetic Cut [120,121,124], not the near physicodynamic side.
Active selections are made prior to the realization of any final function that might later be judged to be “fittest” or “favored.” “Survival of the fittest” is always after-the-fact of optimized algorithmic function. Evolution tells us absolutely nothing about how those algorithms were written or optimized at the genetic and epigenetic levels [143,144].
Co-evolution models exist. But upon careful analysis of all these models, formal elements are at play that deny the contention that the models were purely physicodynamic. Usually, experimental confirmation is lacking. Models are more wish-fulfillments consisting of “could-haves” than empirical science. Neither chance nor necessity can explain algorithms, algorithmic optimization, the valuing and pursuit of utility, computation, or successful computational halting. Chance and necessity cannot explain the programming of Prescriptive Information (PI) instantiated into genetics and epigenetics [52-54]. Biological controls and systems are a cybernetic and engineering problem, not a natural science problem. One can spend a lifetime studying physics and chemistry without ever being able to explain efficacious programming choices. They are active selections, not physicodynamic effects. They are formal, not physical. They are not passive secondary selections based on already achieved final phenotypic fitness. Non-trivial fitness can only be genetically and epigenetically caused by formal active selection. This is known as The Genetic Selection Principle:
The GS (Genetic Selection) Principle states that biological selection must occur at the nucleotide-sequencing molecular-genetic level of 3'5' phosphodiester bond formation and selective methylation of epigenetic configurable switches. After-the-fact differential survival and reproduction of already-programmed, already-living phenotypic organisms (natural selection) does not explain polynucleotide sequence prescription, efficacious epigenetic configurable switch-settings, and other forms of biological coding prescription [105,145-147]. Synthetic organic chemistry requires tightly controlled conditions and a highly specific order of reactions. Choices have to be made long before any usable yields of moieties are produced, especially with the extremely high purity that is required [148-150].
Cybernetic Abstractions are formal.
Programming functions (typically, series of commands) can be condensed by abstractions. Abstractions are summary statements and names (formal representations or symbolizations) of calls to functions. They encapsulate the details of how a problem-solving effort will be implemented [151,152]. The abstractions describe the modules and sometimes the abstract class of desired goal details in a project. The abstraction represents what that module does to contribute to solving a problem. The abstraction does not go into detail of how it codes for the solution to that problem.
Abstractions wisely focus on the most important aspects of solving a complex problem first, to the exclusion of less important aspects of that problem. Abstractions can be quite complex, consisting of many layers or dimensions of smaller commands or modules that are organized into “libraries.” They not only offer the simplicity of brief description of complex function, but of composability in abstracting many modules into larger programming modules. But, of course, all this has to be planned. Programmers call this “analysis and design.” The abstractions have to be very creative and manifest optimized algorithmic efficiency. Why are we talking about artificial cybernetic programming abstractions in a life-definition paper? The answer is that Life at the subcellular level uses extensive abstractions in its programming to solve many multifaceted metabolic problems.
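A hypothetical code fragment (the module and function names are mine, purely for illustration) shows the two properties just described, encapsulation and composability: the caller of a named abstraction sees what it does, never how, and larger modules are composed from smaller named ones.

```python
# Hypothetical illustration of programming abstraction: the name states
# WHAT the module does; the call site never sees HOW it is implemented.

def sort_ascending(items):
    """Abstraction: callers only know the goal (a sorted copy)."""
    # Implementation detail hidden behind the name: here, insertion sort.
    result = []
    for item in items:
        i = 0
        while i < len(result) and result[i] <= item:
            i += 1
        result.insert(i, item)
    return result

# Composability: a larger module is built from the named abstraction
# without re-describing, or even knowing, its internal details.
def median(items):
    ordered = sort_ascending(items)
    return ordered[len(ordered) // 2]

assert sort_ascending([3, 1, 2]) == [1, 2, 3]
assert median([9, 1, 5]) == 5
```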
The most conceptually complex (not just complex) problems known to science are those of orchestrating homeostatic metabolism at the subcellular level. Even a theoretical protometabolism requires ingenious organization and controls. These facts constitute the reality to which so many naturalistic abiogenists are blind, or choose to be blind: the abstractions of programming code are nonphysical and formal. They cannot be reduced to physicodynamics. They cannot be addressed by the “natural sciences.” The abstractions of program modules are engineering realities. If chance and necessity cannot generate a simple piece of wire, how did chance and necessity abstract the programming concepts needed to generate ribosomes? Ribosome anatomy alone is mind-boggling even before analyzing these highly sophisticated digital device “operations.” The extraordinary function of chaperones only compounds the problem of even beginning to provide a physicodynamic explanation for such exquisite orchestration of the protein tertiary structure roles of each highly functional player.
Subcellular computer-like computation.
Inanimate nature knows nothing of “goals,” “directed processes,” “intent,” “desire,” “instructions,” “commands,” “algorithms,” “usefulness” or “function.” Why are such human-generated artificial cybernetic concepts so prominent in subcellular metabolism?
It is not a matter of how human experimenters conceive molecular assemblies, as Assembly Theorists argue. Abiogenesis by definition must arise out of inanimacy [153]. Inanimacy produced stand-alone physicodynamic causation. Present-day human conceptualizations of moieties are irrelevant to abiogenesis research.
Life uses Material Symbol Systems [154,155]. Physical symbol vehicles (tokens) represent cybernetic commands (e.g., triplet codons) [156]. Such representationalism is formal, not physical [14]. But how could a physical symbol vehicle be generated with “meaning” that could execute the desired command and function? Nucleoside selection for polymerization at a certain locus in the string of commands is not physicochemically mandated. All chemical bonds are the same 3′,5′-phosphodiester bonds. No physical determinism of coding in polymerization exists. If it did, instead of Prescriptive Information we would have nothing but meaningless homopolymers. The problem of the prescription of biofunction is by far the most plaguing problem of abiogenesis research [53,54,119].
Life’s symbol systems needed formal rules of interpretation and execution, not fixed laws. Where did these formal rules come from? With the representation of commands came the possibility, for the first time, of a machine being designed and engineered to “read” and execute each command formally represented by a physical symbol vehicle. This didn’t happen first with de Vaucanson, Jacquard or Babbage. It happened first at the subcellular level with abiogenesis! [157]. Sometimes, multiple signs or symbols were needed to express a simple command. Other times, multiple commands were needed to complete even a simple task. The sequence of symbolized commands was the equivalent of language syntax (e.g., a “phrase” or “clause” of instruction). This resulting linguistic syntax gave rise to the first “program.” Its instantiation into the physical world ultimately gave rise to a prebiotic “Turing tape.” The Turing tape was like a bunch of punch cards scotch-taped together into one long string. But the tape was worthless without a very sophisticated “reading head” and Turing-like machine. A large percentage of life’s description, if not definition, falls into the engineering category, not the natural science category. Life is fundamentally formal rather than physical, although the formalisms are instantiated into physicality with physical configurable-switch and logic-gate settings. But those switches and logic gates can only be set for function by formal active selections. The after-the-fact “natural selection” of the fittest already-halted computations will never explain the programming, processing and computation that is the essence of life.
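The formal ingredients just named (a tape of symbols, a reading head, a rule table of symbolized commands, and halting) can be sketched as a minimal artificial Turing-like machine. The program below merely flips bits and halts; it is a deliberately trivial stand-in, not a model of any biological process:

```python
# A minimal Turing machine: a rule table (the "program"), a tape of
# symbols, and a reading/writing head. This trivial program flips
# every bit on the tape and then halts.
RULES = {
    # (state, symbol read) -> (symbol to write, head move, next state)
    ("scan", "0"): ("1", +1, "scan"),
    ("scan", "1"): ("0", +1, "scan"),
    ("scan", "_"): ("_", 0, "halt"),   # blank cell: operation halts
}

def run(tape):
    tape = list(tape) + ["_"]          # append a blank end marker
    head, state = 0, "scan"
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape).rstrip("_")

print(run("1011"))  # 0100
```

Note that the rule table is pure formalism: nothing in the physics of the tape dictates which rules are loaded, yet the machine is worthless without both the rules and a reading head competent to interpret them.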
Life is Unique
The programming of life is undeniably digitally prescribed and cybernetically processed [4,52-54]. Sub-cellular nanocomputers convert the linear digital sequencing of material symbol systems (mRNA “Turing tapes”) into digitally prescribed primary poly-amino-acid strings that only secondarily fold into needed three-dimensional tertiary structures [158]. Various semiotic codes are not always read from the 5' to 3' direction. Various codes are superimposed and are sometimes read in both directions.
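The conversion of a linear mRNA “Turing tape” into a primary amino-acid string can be sketched with a small fragment of the standard codon table. The codon assignments shown are factual, but the sketch drastically simplifies ribosomal translation:

```python
# A fragment of the standard genetic code: each triplet codon is a
# physical symbol vehicle whose "meaning" (an amino acid, or a halt
# command) is assigned by the code table, not by chemical necessity.
CODON_TABLE = {
    "AUG": "M",   # methionine (also the start command)
    "UUU": "F",   # phenylalanine
    "GGC": "G",   # glycine
    "UGG": "W",   # tryptophan
    "UAA": None,  # stop codon: the operation halts
}

def translate(mrna):
    """Read the tape 5'->3' in triplets; halt at a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue is None:     # stop: halting of this operation
            break
        protein.append(residue)
    return "".join(protein)

print(translate("AUGUUUUGGUAA"))  # MFW
```

The resulting primary string is purely linear and digital; its folding into a functional tertiary structure is a separate, secondary event not represented here.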
Long before Babbage, prebiotic nature had already made sure that each individual operation would finish (halt) before the next operation was commanded. Are life’s programmed processing “commands” physicodynamic causes, or cybernetic engineering causes? What does physico-chemistry “command”? Laws don’t need to command; they simply constrain and force outcomes. The very notion of “commands” strongly suggests actively selecting certain options from within Shannon possibility space in order to achieve utility. But at the programming and operational Turing-machine level, those suggestions have been converted into prescriptions (choice-contingent efficacious orders). The “interpretive competence” of subcellular biosemiotics and biocybernetics cannot be reduced to physicodynamic laws and constraints. Neither is the formal orchestration of life’s bioengineering reducible to metaphysically motivated physicalistic philosophy.
Is “computation” an accurate descriptor of life? Absolutely. Subcellular computation is what life does. Information processing is the very essence of life. Life also depends upon sophisticated equipment—subcellular nanocomputers and exquisitely tailored and engineered molecular machines. For even the simplest heat engine to form, Maxwell’s demon must make intelligent choices of when to open and close his trap door [101]. If Maxwell’s demon’s trap door is operated either by chance or necessity, no simple heat engine will arise.
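The point about Maxwell’s demon can be made computationally. In the toy simulation below, a choice-based gate sorts fast particles into one chamber, while a coin-flip (chance) gate produces no sorting at all. The setup is entirely hypothetical and illustrative, not a thermodynamic model:

```python
import random

# Particles arrive at a trap door with random speeds in [0, 1).
# A "demon" decides whether each particle passes to the right
# chamber. Choice-based gating sorts fast particles rightward;
# a coin-flip door does not sort at all.
def simulate(gate, trials=20000, seed=1):
    rng = random.Random(seed)
    left, right = [], []
    for _ in range(trials):
        speed = rng.random()
        (right if gate(speed, rng) else left).append(speed)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(left), mean(right)

choice = lambda speed, rng: speed > 0.5         # demon selects fast particles
chance = lambda speed, rng: rng.random() < 0.5  # door opens at random

print(simulate(choice))  # fast particles sorted right: means near 0.25 and 0.75
print(simulate(chance))  # no sorting: both means near 0.5
```

Only the choice-contingent gate produces the speed differential that a heat engine could exploit; chance gating leaves the two chambers statistically identical.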
How did all of these devices come into existence in a prebiotic environment at the same time and place? This equipment must “speak the same language” at the messages’ destination that was spoken at the messages’ source.
Any model of hypothetical spontaneous production of these machines by irreversible nonequilibrium thermodynamics, “assembly theory,” chaos theory, or complexity theory is statistically prohibitive. In addition, such models measure out with a Universal Plausibility Metric of ξ < 1.0, requiring their rejection by peer review as plausible scientific models of origin [159-161]. Such wish-fulfillment models are unworthy of grant funding or publication.
When no Choice Causation is allowed into scientific study, not even the source of a simple heat engine can be elucidated. Maxwell’s demon cannot be excluded from any mechanism of non-trivial machine formation. No Choice Causation—no subcellular nanocomputer or molecular machines! No Choice Causation—no Life! It’s that simple. The superimposition of a completely different coding scheme for translational pausing on top of the codon scheme in the same mRNA further compounds the incredible degree of formalism.
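The superimposition of two coding schemes on one symbol string can be sketched as follows. The triplet framing is factual, but the “pause” code shown is purely hypothetical, a placeholder for the actual translational-pausing code:

```python
# One and the same symbol string read under two superimposed coding
# schemes: the (factual) triplet codon frame, and a second, purely
# hypothetical "pause" code flagging designated slow codons. The
# point is formal: two independent meanings ride on one sequence.
SEQ = "AUGUUUGGGUGGUAA"

def codon_reading(seq):
    """Scheme 1: partition the string into triplet codons."""
    return [seq[i:i + 3] for i in range(0, len(seq), 3)]

def pause_reading(seq, slow=frozenset({"GGG"})):
    """Scheme 2 (hypothetical): positions of 'slow' pause codons."""
    return [i // 3 for i in range(0, len(seq), 3) if seq[i:i + 3] in slow]

print(codon_reading(SEQ))  # ['AUG', 'UUU', 'GGG', 'UGG', 'UAA']
print(pause_reading(SEQ))  # [2]
```

Neither reading interferes physically with the other; the two meanings coexist only formally, in the interpretive schemes applied to the same string.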
The codon table and other biosemiotic code systems are highly conceptual. Homeostatic metabolism manifests innumerable purely formal concepts and controls, not just physicochemical interactions. Life’s processes are beyond ingenious. The sub-cellular cybernetics of life is more sophisticated than the finest mainframe computer systems.
NeoDarwinism doesn’t select nucleosides or program their sequencing. Evolution is nothing more than the differential survival and reproduction of the fittest already-programmed, already-cybernetically-processed, already-living organisms. The model is worthless in trying to explain abiogenesis. Co-evolution models of code origin can only go so far before their brevity, simplism and naivete become painfully obvious. Transcription and translation are vastly more extensive, cybernetically creative and refined than cartoon co-evolution models will ever be able to explain. What directed biochemical pathways toward undeniable goals of metabolic holism? NeoDarwinism only selects for the fittest already-living organisms, not the fittest stand-alone functions. All of the biochemical pathways must first be integrated into unified protometabolic schemes. Not even natural selection can advance without prior active selection. Only formal algorithmic optimization could have produced the fittest organisms for evolution to secondarily prefer [145-147]. Evolution has no goal, and no pursuit of function or efficiency. NeoDarwinian evolution can play no role whatsoever in abiogenesis. There must first be viable, reproducing protocells that can differentially survive or die for natural selection to begin. There is good reason why we dichotomize “Science and Engineering.” Engineering requires controls, not mere constraints. Naturalistic neuroscience has failed miserably to reduce formal controls to nothing more than fixed laws and physical constraints [140].
When science restricts its study to chance and necessity, it hits a brick wall in explaining what makes most of reality “effective.” To exclude repeatedly observable active selection and its considerable effects from the study of objective reality leaves the science of biology severely crippled. Science cannot practice its trade while in denial of the F > P Principle (Formalism > Physicality), which states that “Formalism not only describes, but preceded, prescribed, orchestrated, organized, and continues to govern and predict Physicality.” The F > P Principle is an axiom that defines the ontological primacy of formalism in a presumed objective reality that transcends human epistemology, our sensation of physicality, and physicality itself. The F > P Principle works hand in hand with the Law of Physicodynamic Incompleteness, which states that physicochemical interactions are inadequate to explain the mathematical and formal nature of physical law relationships [162,163]. Physicodynamics cannot generate formal processes and procedures leading to nontrivial function. Chance, necessity and mere constraints cannot steer, program or optimize algorithmic/computational success to provide desired nontrivial utility [143,144]. Naturalistic biological science cannot continue to bury its head in the sand by studying only physicodynamic causation. Any theory of everything would itself be a formalism. It would have to address the reality and origin of the formalisms that transcend and control all aspects of physicality, not just life [43,116,119,143,144,163-166]. Naturalistic science is absolutely crippled by its failure to admit the clear dichotomy between mere physicodynamic constraints and formal choice-contingent controls. Unfortunately, this is quite deliberate. It is metaphysically motivated, in violation of Einstein’s wise advice to minimize metaphysics.
The inability of the naturalistic scientific community to define life and explain its abiogenesis stems directly from its refusal to acknowledge that life is fundamentally formal rather than physical [167].
The science of Biology must acknowledge the reality of formal programming, the cybernetic processing of that programming, integrated circuits, and the orchestration of highly integrated biosystems if it is to make any real progress in elucidating abiogenesis. Refusal to acknowledge prebiotic flow from the far side of the Cybernetic Cut [120,121,124] precludes understanding the introduction of any abstract concept or formalism into physicality. Disallowing abstract concepts as causation in biological science means disallowing controls and computation. No computation means no information processing, no completed functional operations, and no realized homeostatic metabolism. The F > P Principle cannot be denied or circumvented [143,144]. The time is long overdue to admit the reality of a crippling Kuhnian paradigm rut that dominates naturalistic science [168]. There is a very good reason why computational biology and systems biology are so effective in studying life and homeostatic metabolism: life itself is fundamentally computation. Computation is as formal as mathematics. That means that life is fundamentally formal even though instantiated into physicality. Formalisms are choice-contingent and are nonphysical. They cannot be reduced to mass/energy and the four known forces of physical nature. Physicalism is dead.
All results are contained within the manuscript and its references.
Authorship contribution statement: David Lynn Abel alone contributed Writing - review & editing, Visualization, Validation, Supervision, Software, Resources, Project administration, Methodology, Investigation, Funding acquisition, Formal analysis, Data curation, and Conceptualization.
The author declares that he has no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
The author gratefully acknowledges a grant from The Origin of Life Science Foundation, Inc. #ABE6-2024.