
TABLE OF CONTENTS
1. Introduction
2. Cronin-Walker Assembly Theory and the Anti-Reductionist Turn
3. Physics is Not Causally Closed: Nicolas Gisin’s Anti-Mechanism
4. Barbara Drossel’s Anti-Reductionism
5. Donald Hoffman’s Case Against Reality: There are No Brains
6. Colin McGinn’s Basic Structures of Reality: A Philosophical Analysis of Physics-Based Metaphysics and Structural Realism
7. Thomas Nagel’s Mind and Cosmos: A Defense of Teleological Naturalism and a Critique of Materialist Reductionism
8. Kevin Mitchell’s Free Agents: A Biological Case Against Mechanistic Determinism
9. Conclusion
The essay below will be published in six installments; this, the first, contains sections 1-2.
But you can also download and read or share a .pdf of the complete text of the essay, including the REFERENCES, by scrolling down to the bottom of this post and clicking on the Download tab.
Anti-Reductionist Philosophy of Science Against Mechanism
1. Introduction
The doctrine of mechanism is defined by Robert Hanna as follows:
[E]verything in the world is fundamentally either a formal automaton or a natural automaton, operating strictly according to Turing-computable algorithms and/or time-reversible or time symmetric deterministic or indeterministic laws of nature, especially the Conservation Laws (including the First Law of Thermodynamics) and the Second Law of thermodynamics, which also imposes always-increasing entropy—i.e., the always-increasing unavailability of any system’s thermal energy for conversion to causal (aka “mechanical”) action or work—on all natural mechanisms, until a total equilibrium state of the universe is finally reached. (Hanna, 2024: p. 23)
One of the principal motivations for mechanism is the belief that science requires unification by top-to-bottom reduction: of the human and social sciences to biology, of biology to chemistry, and of chemistry to physics. On this view, consciousness is a nomological dangler in need of some type of reductionist explanation. In this essay, we will examine works that reject this core meta-scientific assumption of scientific unificationism.
We should note that the mechanistic and physicalist assumption that reduction has been thoroughly successful in science, leaving consciousness as a lone anomaly, is itself open to challenge. In fact, it does not hold: reductionism in biology is false; cells are not “computers” and genes are not their “program” (Krimsky and Gruber, 2013; Noble and Noble, 2023; Ball, 2024). Biology is not reducible to chemistry, nor chemistry to physics, even though the lower-level sciences provide a basis for the higher-level ones (Primas, 1983; Horst, 2007; Del Santo and Gisin, 2019). In this essay, we will consider scientists and philosophers of science who have sought to develop comprehensive non-mechanistic accounts of reality. The basic point of this essay is that there are defensible, rational, naturalistic alternatives to the reductionist worldview of mechanism.
The following discussion has these seven sections:
2. Cronin-Walker Assembly Theory and the Anti-Reductionist Turn
3. Physics is Not Causally Closed: Nicolas Gisin’s Anti-Mechanism
4. Barbara Drossel’s Anti-Reductionism
5. Donald Hoffman’s Case Against Reality: There are No Brains
6. Colin McGinn’s Basic Structures of Reality: A Philosophical Analysis of Physics-Based Metaphysics and Structural Realism
7. Thomas Nagel’s Mind and Cosmos: A Defense of Teleological Naturalism and a Critique of Materialist Reductionism
8. Kevin Mitchell’s Free Agents: A Biological Case Against Mechanistic Determinism
Sections 2, 3, 4, 5 and 8 are contributions from established mainstream scientists who are opposed to reductionism as advocated by physicalists and mechanists. Sections 6 and 7 outline two leading philosophers’ works which also reject mechanism. Obviously enough, none of these works, or any work for that matter, is beyond controversy, and the books by McGinn and Nagel have been subjected to some savage attacks by their opponents. Here, however, our main concern is to outline anti-mechanist approaches for those who are interested in the topic but are not active researchers; we have in mind students, or academics working in other fields. Given the ground to be covered, the review will be deliberately thin on criticisms, beyond the most substantial ones, in order to cover the whole area. And the positions taken as a whole are not always mutually consistent: for example, Donald Hoffman defends an anti-realist psychology and metaphysics, while others in the survey are broadly realist. We will not attempt to answer the more general question of who is right; that’s a topic for another day. Readers can take this essay to be an anti-mechanist toolkit, where some tools may ultimately prove unsatisfactory, but others more fit for anti-mechanist purposes.
2. Cronin-Walker Assembly Theory and the Anti-Reductionist Turn
The tension between reductionist and anti-reductionist approaches has been central to debates in complexity science, biology, and philosophy of science. Lee Cronin, a chemist at the University of Glasgow, and Sara Walker, an astrobiologist and theoretical physicist at Arizona State University, have developed Assembly Theory as a framework that explicitly challenges reductionist assumptions about complex systems (Cronin and Walker, 2016; Marshall et al., 2017, 2019; Sharma et al., 2023; Walker, 2017; Walker and Davies, 2013; Walker et al., 2016). Their collaboration represents a significant departure from traditional approaches that attempt to understand complexity through decomposition into simpler parts or purely informational descriptions.
Assembly Theory emerged from practical concerns about detecting life and complex objects, but its implications extend far beyond astrobiology. The theory proposes that complex objects possess irreducible historical dimensions that cannot be captured by analyzing their constituent parts or final states alone. This anti-reductionist stance has profound implications for how we understand causation, information, and the emergence of novelty in complex systems.
Assembly Theory’s central innovation is the concept of the assembly index (AI), defined as the minimum number of recursive assembly steps required to construct an object from its basic building blocks. Unlike traditional complexity measures that focus on information content, algorithmic complexity, or thermodynamic properties, the assembly index captures the irreducible historical dimension of complex objects.
The assembly index differs from Shannon information or Kolmogorov complexity in crucial ways. While these measures focus on the informational content or shortest description of an object, the assembly index measures the minimum causal pathway required for the object’s construction. This distinction is fundamental to Assembly Theory’s anti-reductionist character: the theory insists that the construction pathway is an ontological feature of the object, not merely an epistemological tool for description.
For example, two molecules with identical composition might have different assembly indices if they require different numbers of construction steps. Traditional reductionist approaches would consider these molecules equivalent if their constituent atoms are identical, but Assembly Theory maintains that their different assembly histories make them fundamentally different objects.
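The idea of a minimal construction pathway is easiest to see in a toy setting. The sketch below is an illustrative assumption of this essay, not Cronin and Walker’s actual molecular algorithm (which operates on bond-level operations and mass-spectrometry data): it computes an assembly index for short strings by brute-force search, finding the minimum number of joining operations needed to build a target string from its individual characters, where every intermediate product remains available for reuse.

```python
from itertools import product

def assembly_index(target: str) -> int:
    """Minimum number of joining steps needed to build `target` from its
    individual characters, where every intermediate can be reused.
    Brute-force breadth-first search; only feasible for short strings."""
    basics = frozenset(target)  # the basic building blocks: single characters
    if target in basics:
        return 0  # a lone character needs no assembly steps
    frontier = [frozenset()]    # states: sets of non-basic intermediates built so far
    seen = {frozenset()}
    steps = 0
    while frontier:
        steps += 1
        next_frontier = []
        for pool in frontier:
            available = basics | pool
            for a, b in product(available, repeat=2):
                joined = a + b
                if joined not in target:
                    continue  # prune intermediates that cannot occur in the target
                if joined == target:
                    return steps
                new_pool = pool | {joined}
                if new_pool not in seen:
                    seen.add(new_pool)
                    next_frontier.append(new_pool)
        frontier = next_frontier
    raise ValueError("empty target")
```

Reuse of intermediates is what keeps the index low: “ABAB” takes only 2 steps (A+B, then AB+AB), whereas “ABCD”, with no repeated structure, takes 3. This is the toy analogue of the point in the text: objects with internal repetition have shorter construction histories than their sheer size suggests.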
Assembly Theory emphasizes recursive construction processes where simpler objects are combined to create more complex ones, which can then serve as building blocks for even more complex objects. This recursive structure creates hierarchical levels of organization that cannot be reduced to the properties of the most basic level.
The recursive nature of assembly processes means that complex objects carry information about their construction history in their very existence. This information is not encoded separately from the object but is constitutive of what the object is. A molecule with high assembly index could not exist without the specific sequence of assembly operations that created it, making its history an irreducible aspect of its identity.
A crucial aspect of Assembly Theory is the copy number threshold—the idea that objects with high assembly indices are unlikely to exist in multiple copies unless produced by evolutionary or selective processes. This connects Assembly Theory to questions about life and selection: objects that are both complex (high assembly index) and abundant (high copy number) indicate the presence of mechanisms that can reliably reproduce complex assembly pathways.
This insight challenges reductionist approaches that treat selection as merely differential survival and reproduction. Assembly Theory suggests that selection is fundamentally about the ability to maintain and reproduce complex assembly pathways, making the construction process itself the target of selection rather than just the final product.
Assembly Theory’s most fundamental anti-reductionist commitment is the claim that the construction history of complex objects is irreducible. Traditional reductionist approaches assume that if we understand the parts and their interactions, we can understand the whole. Assembly Theory argues that this approach misses essential information about how the object came to be assembled.
This historical irreducibility is not merely epistemic—it is not that we find it convenient to track construction histories, but rather that these histories are ontologically fundamental features of complex objects. Two objects with identical final states but different assembly pathways are genuinely different objects according to Assembly Theory.
The theory therefore proposes that reductionist approaches fail because they ignore the temporal dimension of complex systems. Understanding a complex object requires understanding not just what it is made of, but how it was made, and this “how” cannot be reduced to the “what.”
Assembly Theory focuses on pathway-dependence, the idea that the specific sequence of assembly steps matters for understanding complex objects. This challenges reductionist approaches that focus on equilibrium states or final configurations while ignoring the pathways that led to them.
The theory argues that causal structure is preserved in complex objects through their assembly pathways. Each step in the assembly process represents a causal relationship that becomes embedded in the final object. This embedded causality cannot be recovered from analysis of the parts alone but requires understanding the assembly process.
This perspective aligns with process philosophy traditions that emphasize becoming over being, but Assembly Theory also provides a quantitative framework for analyzing these processes. The assembly index serves as a measure of accumulated causal depth, the number of causal steps embedded in an object’s construction.
Unlike approaches that treat information and matter as separate categories, Assembly Theory integrates information and materiality through the concept of assembly. The assembly index represents information, but this information is not abstract: it is materially instantiated in the construction pathway of the object.
This integration challenges both materialist reductionism (which reduces everything to matter and energy) and informational reductionism (which treats matter as merely a substrate for information processing). Assembly Theory suggests that complex objects are irreducibly information-material hybrids where information and matter are co-constituted through assembly processes.
The theory proposes that information in complex systems is not code-like (separate from its material substrate), but more like the information in a sculpture, where the information cannot be separated from the material process of its creation. This perspective has implications for understanding biological systems, where genetic information and material processes are often treated as separate categories.
Assembly Theory offers a novel perspective on the nature of life that differs from both genetic/informational and metabolic/thermodynamic approaches. According to Assembly Theory, life is characterized by the ability to generate and maintain objects with high assembly indices through selective processes.
This definition avoids traditional debates about whether life is primarily about information processing, metabolism, or reproduction by focusing on the ability to produce complex assembled objects. Living systems are those that can reliably produce objects that would be extremely unlikely to exist without selective processes, objects that are both complex and abundant.
The theory suggests that the origin of life should be understood as the emergence of processes capable of producing objects with assembly indices above the copy number threshold. This shifts focus from questions about the first replicator or metabolic network to questions about the emergence of assembly processes capable of producing complex objects.
Assembly Theory reconceptualizes evolution as exploration of “assembly space,” the space of all possible assembly pathways. Natural selection becomes a process of discovering and maintaining pathways through assembly space that lead to objects with high survival and reproduction potential.
This perspective challenges reductionist approaches to evolution that focus on gene frequencies or organism-environment interactions. Assembly Theory suggests that evolution operates on assembly pathways themselves, not just on the objects they produce. Mutations and other evolutionary processes are reinterpreted as perturbations in assembly space that may lead to new pathways.
Assembly Theory provides a framework for understanding evolutionary innovations as discoveries of new regions in assembly space. Major transitions in evolution correspond to the opening of new areas of assembly space that were previously inaccessible.
Assembly Theory was originally developed in order to address the biosignature problem: how to detect life in contexts where we cannot make assumptions about its specific biochemical basis? Traditional approaches to life detection rely on searching for familiar biochemical signatures, but this approach may miss life forms based on different chemical foundations.
Assembly Theory proposes that the combination of complexity (high assembly index) and abundance (high copy number) provides a universal biosignature that does not depend on specific chemical assumptions. This approach could detect life forms that differ dramatically from terrestrial biology while avoiding false positives from complex but non-living systems.
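The joint complexity-and-abundance criterion can be sketched as a simple filter. Everything concrete below is an illustrative assumption: the data structure, the cutoff values, and the example objects are invented for the sketch, and the default assembly-index threshold of 15 merely echoes the rough cutoff reported in Cronin’s group’s mass-spectrometry experiments rather than reproducing their method.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    name: str
    assembly_index: int  # estimated minimal number of construction steps
    copy_number: int     # identical copies detected in the sample

def biosignature_candidates(observations, min_ai=15, min_copies=10):
    """Flag objects that are BOTH complex (high assembly index) and
    abundant (high copy number): the joint condition Assembly Theory
    takes as evidence of selection. Thresholds here are illustrative."""
    return [o.name for o in observations
            if o.assembly_index >= min_ai and o.copy_number >= min_copies]

samples = [
    Observation("simple mineral grain", assembly_index=4, copy_number=1_000_000),
    Observation("complex natural product", assembly_index=22, copy_number=5_000),
    Observation("one-off random polymer", assembly_index=25, copy_number=1),
]
# Only the object that is complex AND abundant survives the filter:
# the abundant-but-simple mineral and the complex-but-unique polymer do not.
print(biosignature_candidates(samples))  # ['complex natural product']
```

The point of the conjunction is visible in the example: abundance alone (the mineral) can arise from simple physics, and complexity alone (the random polymer) can arise by chance once, but complexity reliably reproduced in quantity is what the theory reads as a signature of selection.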
This universal approach to life detection reflects Assembly Theory’s anti-reductionist character: rather than reducing life to specific chemical or informational processes, it focuses on general principles about assembly and selection that could apply to diverse material substrates.
Assembly Theory challenges the reductionist methodology of understanding wholes through analysis of parts. The theory argues that compositional analysis (determining what an object is made of) provides insufficient information for understanding complex objects.
Traditional analytical chemistry exemplifies reductionist methodology: understanding a molecule means determining its atomic composition and structural arrangement. Assembly Theory argues that this approach misses crucial information about how the molecule was constructed, information that is necessary for understanding its role in complex systems.
As a consequence, Assembly Theory claims that new analytical methods are needed that can determine assembly indices rather than just compositions. Such methods would represent a shift from reductionist to what we might call “constructionist” approaches to analysis.
Assembly Theory criticizes equilibrium-based thinking that dominates much of chemistry and physics. Equilibrium approaches assume that the important features of systems can be understood by analyzing their stable states, with dynamics treated as transitions between equilibria.
Assembly Theory argues, by sharp contrast, that complex objects exist far from equilibrium and that their non-equilibrium character is essential to their identity. The assembly pathways that create complex objects represent irreversible historical processes that cannot be captured by equilibrium analysis.
This critique extends to statistical mechanics and thermodynamics, which Assembly Theory argues provide insufficient frameworks for understanding complex assembled objects. The theory calls for new approaches that can handle the historical and pathway-dependent character of complex systems.
Assembly Theory attempts to navigate beyond traditional debates between emergentism and reductionism by proposing that assembly pathways are neither reducible to nor merely emergent from their parts. Instead, the theory suggests that assembly processes are ontologically fundamental features of reality.
This position differs from strong emergence (which posits downward causation from higher levels) and weak emergence (which treats higher-level properties as epistemologically but not ontologically distinct). Assembly Theory proposes that assembly pathways are irreducible not because they exhibit novel causal powers but because they represent irreducible historical facts about how objects were constructed.
The theory entails that debates about emergence versus reduction miss the crucial temporal dimension of complex systems. Understanding requires neither reduction to parts nor appeal to emergent properties but attention to construction processes that are irreducible aspects of complex objects.
Assembly Theory shares significant conceptual ground with Alfred North Whitehead’s process philosophy (Whitehead, 1929), particularly the focus on becoming over being and the irreducibility of historical processes. Like Whitehead’s “actual occasions” of experience, Assembly Theory’s assembled objects carry their construction histories as irreducible features of their identity.
The theory’s emphasis on recursion and hierarchical construction resonates with Whitehead’s concept of “prehension,” whereby each actual occasion incorporates aspects of its predecessors (Whitehead, 1929). Assembly Theory provides a quantitative framework for analyzing these incorporative processes through the assembly index.
However, Assembly Theory differs from Whiteheadian metaphysics by focusing on material assembly processes rather than experiential becoming. The theory maintains a materialist ontology while insisting on the irreducibility of temporal processes.
Assembly Theory also connects with more recent work in systems biology and network theory that emphasizes pathway analysis over compositional analysis. The theory extends these approaches by insisting on the ontological significance of pathways rather than treating them merely as useful analytical tools.
Assembly Theory contributes to complexity science by providing new tools for understanding non-linear systems and the emergence of novelty. The theory’s emphasis on recursive construction and pathway dependence aligns with complexity science’s focus on non-linear dynamics and sensitive dependence on initial conditions.
The assembly index provides a way to quantify the complexity that emerges from non-linear assembly processes. Unlike measures based on information theory or network topology, the assembly index captures the temporal depth of construction processes.
The theory proposes that traditional complexity measures miss crucial aspects of complex systems by focusing on structural features rather than construction pathways. This limitation becomes particularly apparent when analyzing biological and technological systems where construction history is crucial for understanding function.
Assembly Theory calls for new experimental methodologies that can determine assembly indices rather than just compositional or structural features. This requires developing techniques that can trace construction pathways rather than just analyzing final products.
The development of assembly-focused experimental methods could transform fields from chemistry to archaeology by providing new ways to understand how complex objects were constructed. Such methods would represent a shift from reductionist analytical approaches to constructionist synthetic approaches.
Assembly Theory presents challenges for computational modelling because traditional simulation approaches focus on dynamics and equilibrium properties rather than construction pathways. Modelling assembly processes requires tracking historical information that is typically discarded in physical simulations.
The theory suggests the need for new computational approaches that can efficiently represent and manipulate assembly pathways. Such approaches might resemble program synthesis or automated theorem proving more than traditional physical simulation.
Assembly-based modelling could provide new tools for understanding biological development, technological innovation, and cultural evolution by focusing on construction processes rather than just final states or dynamic behaviors.
Assembly Theory’s anti-reductionist character makes it naturally interdisciplinary, connecting insights from chemistry, biology, physics, information theory, and philosophy. The theory provides a framework for integration that avoids reducing one level of description to another.
The theory entails that interdisciplinary collaboration should focus on understanding assembly processes that operate across traditional disciplinary boundaries. This could lead to new hybrid fields that combine insights from previously separate domains.
Some critics argue that Assembly Theory’s anti-reductionist claims are undermined by its potential reducibility to information-theoretic descriptions. If assembly indices can be computed from purely informational descriptions, this might suggest that the theory’s materialist emphasis is unnecessary.
The theory’s claims about the irreducibility of assembly pathways depend partly on showing that these pathways cannot be captured by purely informational measures. This remains an active area of debate and development.
In conclusion, Cronin and Walker’s Assembly Theory represents a significant anti-reductionist contribution to complexity science and our understanding of life. By insisting on the irreducible importance of construction pathways, the theory challenges fundamental assumptions of reductionist methodology while providing quantitative tools for analyzing complex systems.
The theory’s anti-reductionist character emerges from several key insights: that construction history is an ontological feature of complex objects; that assembly pathways cannot be reduced to compositional or structural analysis; and that information and materiality are integrated through assembly processes. These insights have implications extending far beyond the theory’s original astrobiological applications.
Assembly Theory’s position on process over product, construction over composition, and pathway over destination aligns it with broader anti-reductionist traditions in philosophy and science while providing novel quantitative frameworks for analysis. Assembly Theory’s anti-reductionist stance does not represent a retreat from scientific rigor but rather an expansion of scientific methodology to include construction processes that traditional approaches have neglected. The theory suggests that science itself must become less reductionist to fully understand the complex assembled objects that populate our universe, from molecules to living systems to technological artifacts. Assembly Theory has been primarily focused upon physics, chemistry and biology, but it still supplies a meta-scientific alternative to mechanism and scientific unificationism by means of a radical rethinking of the basic science of assemblies.

