Maxwell’s Demon is Foiled by Measurement, Not By Memory Erasure.

This paper shows that once we take into account that gas molecules are quantum systems, we find that there is an entropy cost to measuring their positions, and that this is what prevents a Maxwell’s Demon from violating the second law of thermodynamics. The tradition, originated by Bennett and Landauer, of taking “memory erasure” as the savior of the second law is shown to be misguided and untenable. In short, Brillouin was essentially correct but obscured his point by digressing into unnecessary information-theoretic notions; we sharpen his argument by deriving an explicit expression for the entropy cost of measurement and show that it coincides with the quantity attributed to Landauer’s Principle.
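For reference, the Landauer quantity mentioned above is standardly stated as follows; this is just the textbook bound, not the paper’s derivation (which is in the full text). Erasing one bit of memory in contact with a reservoir at temperature T costs

\[
\Delta S \;\ge\; k_B \ln 2 \ \text{per bit erased},
\qquad
Q \;=\; T\,\Delta S \;\ge\; k_B T \ln 2 .
\]

The paper argues that this same quantity is already incurred at the measurement step, via \( \Delta S = \Delta Q / T \), before any question of erasure arises.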

Updated on 5.12.25 with the final version, accepted for publication in Foundations.

4 thoughts on “Maxwell’s Demon is Foiled by Measurement, Not By Memory Erasure.”

  1. This is a fascinating paper. If I get this right, the idea is that there is a price that must be paid for making an observation/measurement in the quantum context? In classical mechanics, the observer has the ‘privilege’ of making observations from the ‘G-d’s-eye view’ that is presumed to be wholly unobtrusive. Classical observation is done ‘free of charge’ and without consequence. It is a one-way exchange from observed to observer in which the act of observation has no influence. Would that be equivalent to saying that, when making observations in the general scenario of classical physics, there is no ‘observation-related’ entropic cost? Whereas when observations are made in the quantum context, there is an unavoidable entropic cost involved? Because observation is, in the quantum context of the transactional interpretation, a two-way relational ‘Yin/Yang’ transactional exchange and not a one-way ‘Yang-only’ forced exchange in which the system examined is passive and unchanged in the process. I am wondering to what degree this relatively intuitive take is an accurate one. Thank you, Ruth.

  2. Thanks, Gary.

    Of course, the ‘classical’ world doesn’t really exist and is only an idealization; so there is ALWAYS an entropic cost to any measurement. We can usually get away with ignoring it, but NOT when discussing matters of principle such as the ultimate nature of the 2nd law and whether it can be violated by a microscopic creature. In any case, the result in my paper is just plain old vanilla QM and doesn’t depend on any particular interpretation, or even on RTI. Heisenberg himself noted long ago that in order to observe anything (in the sense of localizing it), we have to bounce photons off it, and that alone constitutes an entropy increase dS = dQ/T. Once we take his uncertainty principle into account, along with a thermodynamically sound definition of “entropy,” we clearly find the entropy cost that has long been neglected.
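    In symbols, the standard Brillouin-style estimate (my gloss here, not the paper’s full derivation) goes like this: to localize a molecule, the probe photon must stand out against the blackbody background at temperature T, so its energy must satisfy \( h\nu > k_B T \). When that photon is absorbed as heat, the entropy increase is at least

    \[
    \Delta S \;=\; \frac{\Delta Q}{T} \;=\; \frac{h\nu}{T} \;>\; k_B \;>\; k_B \ln 2 ,
    \]

    which already exceeds the \( k_B \ln 2 \) per bit that the erasure tradition locates at the memory-reset step.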

    Unfortunately, over the past half-century, discussion of Maxwell’s Demon has ignored this fact and instead pretended that one can have a “God’s eye view” of gas molecules. This has all been made possible by a pathological mixture of (i) obsession with ill-defined computational notions based on Shannon ‘information’ and (ii) conspicuous neglect of actual physics including the uncertainty principle. [Arguably, von Neumann started people down that garden path to nowhere when he advised Claude Shannon to call his ‘information’ quantity ‘Entropy’ “because nobody knows what that is.” But in fact, we do. It’s Q/T, and what someone happens to be ignorant about (quantifiable by “Shannon information”) is neither heat nor a temperature.]
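    To make the mismatch explicit (these are just the standard definitions, nothing specific to my paper): Shannon’s quantity is a dimensionless measure of someone’s ignorance, whereas Clausius entropy is heat over temperature:

    \[
    H \;=\; -\sum_i p_i \log_2 p_i \ \ \text{(bits)},
    \qquad
    \Delta S \;=\; \frac{\Delta Q}{T} \ \ \text{(J/K)} .
    \]

    The two are connected only by the conversion factor \( k_B \ln 2 \) per bit, and only when the probabilities \( p_i \) describe a physical ensemble that actually exchanges heat; ignorance by itself has neither a Q nor a T.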

  3. Thank you, Ruth. I am in complete agreement and have much more to say about this. I am presenting a paper in a couple of weeks in Mexico City at the first international conference on ‘Anticipatory Systems and Rosennian Complexity’ which, I think, relates to these ideas at a fundamental level, drawing on the ideas of theoretical biologist Robert Rosen OBM. Happy to discuss further offline.

  4. Sorry about the frequent spelling mistakes, the result of trying to generate comments on an iPhone keyboard. My bad.

    My feeling is that Robert Rosen’s work on making a categorical distinction between the relational structure of systems that are living organisms and the relational structure of systems that are mechanisms is deeply significant in all of this, because living organisms engage in bilateral transactional processes. They resist formalization and strictly deterministic description; one might argue that they are characterizable as ‘conscious agents’ with free-will agency. This raises the possibility of a connection to the ideas that Karen Barad develops with regard to what she calls ‘agential realism’ as a philosophical position with ethical implications, developed in her book ‘Meeting the Universe Halfway’. It seems to me that the fundamental implication of RTI is that both the emitter and the absorber take active roles in the context of a transaction, and that a transaction is a mediated exchange between ‘agents’ who are active participants in a shared bilateral process.

    Whereas the classical Newtonian mechanical (CNM) scenario is one of an active ‘giver’ and a passive ‘receiver’. In CNM, the ‘receiver’ makes no ‘choice’ and exerts no real agency in the process; it takes whatever is thrown at it. There are no ‘ifs, ands or buts’ as far as the ‘absorber’ is concerned. In the terminology of the ‘New List of Categories’ of Charles Sanders Peirce, CNM implies pure ‘Secondness’: the unmediated, ‘billiard-ball mechanics’ scenario. But CNM works under limited circumstances where mechanistic determinism can be assumed because the interactional aspect can be ignored; Bohm and Hiley would say that the ‘Quantum Potential’ is negligibly small. Then a non-formalizable living system that really manifests ‘Rosennian complexity’ can be assumed to be modelable as a formalizable mechanism, a ‘simple system’ with no ‘Rosennian complexity’. This is a process of radical reduction with grave moral implications. Peirce would say that it is an act that eliminates ‘Thirdness’, or the possibility of mediation, and transforms a living world into a dead, unmediated world of ‘Necessitarianism’ in which a nominalistic, strictly deterministic worldview becomes dominant. What Henri Bergson called the ‘Élan Vital’ is eliminated in this reductive process of mechanization, along with free-willed agency.

    What is brilliant about the RTI way of understanding the implications of quantum physics is that it formulates the dynamics in terms of bilateral interactions between entities, involving (presumably) mediated signaling that underlies the actual occurrence of an exchange of mass, energy, or information.

    I was wondering if the way that Karen Barad has formulated her ‘Agential Realism’ is of any interest or value to you?
