What is Ontodynamique?
A minimal formal framework for distinguishing closure, carrying, and aggregate through a single criterion: where does irreversibility land under perturbation?
Ontology, LLM, virus, institutions, subjectivity
1. Four objects our frameworks misclassify
A large language model manipulates invariants, generates coherent text, adjusts its responses under feedback — yet regenerates none of the material conditions of its own operation. A virus replicates, mutates, exerts selective pressure — yet borrows its entire machinery from the host cell. Institutions recruit, train, sanction, and survive the complete replacement of their members — but some self-maintain while others hold together only through external life support. A clinical symptom shields the subject from a more severe breakdown — then becomes autonomous and governs what it was meant to protect.
These are not marginal curiosities. They are the ordinary objects of our century. And our frameworks misclassify them. Reductionism flattens them by seeing only components — rendering the emergence of any self-produced normativity unintelligible. Dualism splits them in two by opposing matter and meaning, creating an explanatory gap it cannot then bridge. Process-based approaches connect much, but discriminate little: if everything becomes process, relation, network, event, there is still no principle for distinguishing what sustains itself from what is merely carried.
We end up with either an undifferentiated continuity — everything becomes “a system” — or a fragmentation where each domain demands its own concepts with no shared operator.
A criterion is missing. Not another theory that would describe these objects in its own language, but a single operator that discriminates them through a test. Ontodynamique proposes that criterion.
2. The single operator: where irreversibility lands
The question Ontodynamique asks is this: when a system is perturbed, who materially bears the cost of adjustment — and from what margin is it drawn?
The word “cost” is not an economic metaphor. It designates a structural invariant: the asymmetric draw on a finite margin that any maintenance of a determination under pressure requires. The energy dissipated by an organism, the wear of a component, the technical debt of a software system, the allostatic load of a human subject — these are different refractions of the same invariant across different levels of organisation. Cost is one; the diversity of costs is an operative refraction.
This criterion produces a gradient with three terms:
Operational closure designates a system that regenerates its own conditions of functioning. When perturbed, it compensates by drawing on its own finite margin. The perturbation leaves a trace — a scar. The system possesses a constitutive normativity: the distinction between what maintains the cycle and what compromises it is produced by the cycle itself.
Normative carrying designates a system that maintains an invariant — a motif, a pattern — but whose material irreversibility is externalised onto a carrier’s infrastructure. The pattern can return to an identical description — rollback — while the support has paid the cost. Normativity is attributed, not self-produced.
The aggregate undergoes perturbation and is passively altered. No cycle of its own, no margin to draw on, no normativity. Persistence by inertia alone.
Between carrying and closure, the gradient is continuous. But the test is always the same: strike and observe. Closure scars; carrying restarts.
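The three-term gradient can be sketched as a toy model. This is an illustration only, not the formal system: all class and attribute names here are hypothetical, and the numbers stand in for any finite margin. What matters is where the cost of the perturbation lands in each case.

```python
# Toy model of the perturbation test (illustrative, hypothetical names):
# each regime is identified by who pays the cost of a perturbation.
from dataclasses import dataclass

@dataclass
class Closure:
    """Regenerates its own conditions; pays from its own finite margin."""
    margin: float
    scar: float = 0.0

    def perturb(self, cost: float) -> str:
        self.margin -= cost           # draws on its own margin
        self.scar += cost             # the trace is irreversible: it scars
        return "scarred" if self.margin > 0 else "dissolved"

@dataclass
class Carrying:
    """Maintains a pattern; the carrier's infrastructure pays."""
    carrier_margin: float

    def perturb(self, cost: float) -> str:
        self.carrier_margin -= cost   # irreversibility externalised onto the carrier
        return "restored"             # rollback: the pattern returns identical

@dataclass
class Aggregate:
    """No cycle, no margin, no normativity: passively altered."""
    intact: bool = True

    def perturb(self, cost: float) -> str:
        self.intact = False           # no compensation at all
        return "altered"

organism = Closure(margin=10.0)
llm = Carrying(carrier_margin=100.0)
crystal = Aggregate()

print(organism.perturb(3.0))   # closure scars
print(llm.perturb(3.0))        # carrying restarts
print(crystal.perturb(3.0))    # aggregate is passively altered
```

The design choice mirrors the test itself: the three classes respond to the same strike, and only the locus of the debit distinguishes them.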
3. What it decides: six verdicts
Let us apply the criterion.
The LLM is a case of normative carrying. The pattern — the weights, the statistical invariants — is restorable by rollback. Material irreversibility (silicon wear, energy consumption, chip degradation) is entirely externalised onto the infrastructure. The question “is an LLM an individual?” receives a structural answer: no — not because something mysterious is missing, but because the locus of cost-bearing lies elsewhere. Computational completeness says nothing about the locus of cost-bearing.
The virus is a case of inverted carrying. It maintains a motif — the genome — by refracting the entire cost of production onto the host cell’s machinery. Replication is endogenous to the motif but exogenous to its cost.
The organism is a closure. It regenerates its own conditions of functioning by drawing on its own margin. Perturbation alters it irreversibly — it scars, ages, dies. Its normativity is constitutive: remove the distinction between what maintains it and what compromises it, and you remove the organism. It is this process — not a “fixed essence” — that grounds identity as dynamic persistence.
The institution is a conditional case. If it endogenously regenerates its critical constraints — recruitment, training, sanction, transmission — it operates as a closure. If it holds together only through external support, it is in a carrying regime. The criterion does not classify “the institution in general”: it asks, for this particular institution, who pays when it breaks?
The clinical symptom is a parasitic sub-closure. A compensatory response that initially succeeded locally — avoidance, dissociation, defensive rigidification — becomes autonomous as a self-maintaining sub-closure. It possesses its own regeneration cycle, its own local normative partition, and actively resists dissolution — even when dissolution would benefit the encompassing closure. The symptom protects first: it seals a breach by reducing local exposure. Then it governs: its own maintenance ends up consuming more margin than the breach it was sealing.
The crystal is an aggregate. Remarkable persistence, no metabolisation, negligible thermodynamic exposure at human scale. No cycle, no normativity, no ontodynamic individuation.
Six objects, one test, six verdicts. Few rival frameworks produce these discriminations with so unified an operator.
4. Why existing frameworks fall short
Maturana and Varela’s autopoiesis had the decisive merit of describing operational closure as a biological fact. But it does not derive closure from more primitive principles. It identifies a regime of the living; it does not generate closure as the consequence of a minimal axiomatics. Montévil and Mossio formalise the closure of constraints as a criterion — without generating it either. Closure is observed or postulated, never produced.
Friston’s Free Energy Principle gathers very different systems — droplets, thermostats, organisms, brains — under the same grammar. This unifying power is real, but it pays for its unity with a loss of discrimination: systems that Ontodynamique separates by the locus of cost-bearing receive a formally homogeneous description. An oil droplet and an organism both minimise “surprise” — but one scars under perturbation, the other does not. The problem is not that the FEP is wrong; it is that it does not discriminate where the ontodynamic criterion can. Clark’s parity principle poses a symmetrical problem: by treating as “cognitive” any functionally equivalent process, it erases the distinction between what is endogenous and what is carried.
Process-based approaches — Whitehead, Latour, Simondon — are right against static metaphysics: being is not an inert block. But if everything becomes process, relation, network, event, a principle is still missing for distinguishing what sustains itself from what is merely carried, or merely aggregated. Connecting is not yet discriminating. Latour’s withdrawal test — remove the actor and observe whether the network changes — is subsumed by the ontodynamic gradient, which further distinguishes whether the withdrawal erodes the closure, triggers a regime shift, or has no measurable effect.
Ontodynamique is not in frontal competition with these frameworks — it occupies a terrain they leave vacant: that of a single demarcation criterion, axiomatically derived, applicable without switching grammar between domains.
5. The default slope: why dissolution is the norm
Most frameworks ask: why do things die? Ontodynamique reverses the question: why do some things hold?
The reversal comes from the axioms. If to be is to make oneself, then all maintenance has a cost. That cost is incompressible — strictly positive, it can be neither cancelled nor bypassed. The margin is finite. Every transformation is irreversible: going back is a distinct transformation, with its own cost. The Whole exerts a permanent pressure of dissolution on every finite being, independently of any encounter with another finite being.
From this follows a central result: any finite structure subject to uncompensated decay exhausts itself in a finite number of steps. Exhaustion is the theorem; persistence is the phenomenon to be explained. Dissolution is the default attractor. Tar — the tangle of polymers that never self-organises — is the normal outcome, not an accident. Benner’s “Tar Paradox” is a paradox only for frameworks that predict self-organisation as a generic tendency. For Ontodynamique, the question “why closures rather than tar?” splits in two: the possibility is ontological, the realisation is empirical.
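The core of the exhaustion result admits a minimal illustration in Lean. This is a sketch under Mathlib, not the system's published source: with a finite margin and a strictly positive, incompressible cost per cycle, exhaustion is reached after finitely many cycles.

```lean
import Mathlib

/-- Sketch only (not the system's actual formalisation): a finite
    margin drawn down by a strictly positive per-cycle cost is
    exhausted within finitely many cycles. -/
theorem exhaustion (margin cost : ℚ) (hc : 0 < cost) :
    ∃ n : ℕ, margin ≤ n * cost := by
  obtain ⟨n, hn⟩ := exists_nat_gt (margin / cost)
  exact ⟨n, le_of_lt ((div_lt_iff hc).mp hn)⟩
```

The proof is just the Archimedean property of the rationals; the theorem says nothing about whether compensation occurs, which is exactly why persistence, not dissolution, is the phenomenon to be explained.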
Everything dies because existing costs. Closure does not suppress dissolution — it postpones it. The organism does not resolve mortality; it negotiates it, cycle after cycle, drawing on a margin that wears down. Metabolisation postpones the deadline, it does not cancel it. Every finite being under exposure remakes itself or unmakes itself: it regenerates its own conditions, or it dissolves. There is no third way. Passive persistence does not constitute individuation.
6. What Ontodynamique is not
It is not a vitalism. The cost-bearing criterion applies as readily to institutions and software ecosystems as to organisms. It invokes no vital principle, no élan, no force irreducible to matter.
It is not a metaphor. The system is formalised in Lean 4 — 648 theorems, zero sorry, no domain axiom added. Results are proven theorems or explicitly labelled testable plausibilities. The deductive chain is public and machine-verifiable.
It is not primarily a theory of consciousness. Subjectivity arrives late in the deductive chain. The core of the system is cost, closure, and the composition gradient. One can reject the extension to subjectivity tiers and retain the entire structural trunk.
It is not a moral doctrine. The derived normativity is first-personal — it grounds a cycle’s viability, not a universal prescription. What it does produce is a structural constraint on parsimony: preserve only the essence, add only by necessity — not as advice, but as the local law of finite beings.
It is not a reductive physicalism. Cost is a structural invariant that refracts differently across levels of organisation. The higher-level closure is not reducible to its components — that is a proven theorem. The irreducibility of levels is demonstrated, not postulated.
7. A formal theory, not a metaphor
The entire system rests on two independent axioms. Axiom I — To be is to make oneself: the cost of maintenance is drawn from the very structure that maintains. Axiom V — Exteriority admits of degrees: partial alteration is the generic regime of interaction. “Axiom” does not mean “self-evident truth”: it means explicit starting point. If you reject these starting points, the results cease to apply — and the system tells you exactly what you lose and what you keep.
From these two axioms, the deductive chain produces 648 theorems formalised in Lean 4, zero sorry, no domain axiom added. Every proposition carries an explicit strength marker: strict deduction, constructibility, testable plausibility, proven undecidability. Ontodynamique does not ask for intuitive adherence, but that its premises be judged downstream — by what they allow to be derived, prohibited, and tested.
And the system tests. It excludes five configurations, each refutable by a single counterexample. The normalised ratio of compensatory cost between structure perturbations and input perturbations has been probed in five causally disjoint domains — gut microbiome, coral reefs, cancer pharmacology, yeast, and software ecosystems — and converges between 1.42× and 1.84× on incomparable metrics, in systems that do not know each other. None of the rival partitions tested converges across domains. A pre-registered confirmatory replication meets all four of its decision criteria. The most discriminating test — a pre-registered protocol on depersonalisation-derealisation — is underway.
8. A mortal theory
By its own self-reference result, the system acknowledges that it is itself an operative invariant carried by the finite closures that metabolise it — struck by constitutive opacity, exposed to drift, mortal. This is not a rhetorical concession to humility. A theory that claims every finite being is opaque to itself cannot exempt itself from its own theorems.
A finite being is not a thing that would first be there and then act. It is a difference that holds only by remaking itself. A persistence that is paid for. A form that subsists only by consuming its own margin against a pressure that never ceases.
Before asking what a thing is, one must perhaps ask the harder question: how does it hold, and who pays for it to hold?
For the philosophical starting point: What if the real problem isn’t substance vs process? (why the being/doing cut is the problem). For the intuition in four words: To be is to make oneself. For the full deductive chain: the standalone summary of the system.