Emergent Complexity: The Fourth Law of Thermodynamics?

The transfer of energy dictates everything on Earth, from the movement of atoms to the global economy. In high school or first-year chemistry we learn that the rules governing the movement of energy are defined by just three laws of thermodynamics (four if you count the zeroth law). Yet this simplicity can be misleading, as demonstrated by how often the second law is misunderstood, misused, and abused. The second law states:

The entropy of closed systems undergoing real processes must increase.

For some people the second law translates to “everything progresses from order to disorder” or “it is impossible for complexity to arise from randomness.” The biggest promoters of this misguided interpretation are advocates of intelligent design and/or irreducible complexity, which are just thinly veiled pseudonyms for creationism. They argue that complex systems like the bacterial flagellum or the human eye could not have evolved spontaneously simply because they are complex – a logically precarious stance to take, since these claims have been thoroughly debunked by evolutionary biologists.[1]

A quick bit of reflection on our day-to-day lives produces examples of complexity arising from less complex components. Ants, neurons, and transistors are just some examples of small building blocks that become vastly more complex systems when combined in the right circumstances.

It is easy to argue that the above examples are the result of agency, but there are also many examples of objects naturally arranging themselves into complex structures. In fact, the natural world is very good at arranging atoms. Diamonds, ice crystals, and polycyclic aromatic hydrocarbons are just a few examples. Amphiphilic molecules are an especially useful illustration of this tendency. When these molecules come into contact with water they form beautiful monolayers, bilayers, micelles, and other structures.

So, returning to the second law of thermodynamics, the correct interpretation is that complex structures – like those listed above – are possible, but at the cost of increased entropy in the surrounding environment.[2]

The tendency for a system to self-organize, when given the right circumstance and some energy from the surrounding environment, is one of the most important phenomena we observe. Yet, this transition from energy to order is not obvious when looking at our current laws of thermodynamics. This has led some researchers to suggest it may be possible to formalize a fourth law of thermodynamics that describes how complex systems arise.

Robert Hazen, a geologist at George Mason University, and others have hypothesized that this new law would need to encompass the following four components:

  1. the number of units/elements/parts
  2. the degree/strength of the interaction between the units
  3. how energy flowing into the system affects the units
  4. the changes in energy input

The right combination of these variables will result in a product that is greater than the sum of its parts. The goal for researchers is to produce a model that predicts the emergent structure, given n units in x volume with a degree of interaction y and energy input z.
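As a loose illustration – not any researcher's actual model – the four ingredients above can be wired into a toy Monte Carlo simulation: n units on a grid, an interaction strength that favors contact between neighboring units, and a noise parameter standing in for energy input from the environment. All names and parameter values here are hypothetical sketch choices. With strong interaction and low noise the units cluster into a larger structure; with no interaction they stay dispersed:

```python
import math
import random

def neighbors(cell, size):
    """Four lattice neighbors of a cell, with periodic (wrap-around) edges."""
    x, y = cell
    return [((x + 1) % size, y), ((x - 1) % size, y),
            (x, (y + 1) % size), (x, (y - 1) % size)]

def contacts(cell, occupied, size):
    """Number of occupied cells adjacent to `cell`."""
    return sum(1 for n in neighbors(cell, size) if n in occupied)

def simulate(n_units=50, size=20, interaction=1.0, noise=0.25,
             steps=30000, seed=0):
    """Metropolis-style toy model: units hop to empty neighboring cells.
    Moves that break unit-unit contacts are accepted only with a
    Boltzmann-like probability exp(-dE/noise), so order can emerge from
    random motion. Returns the average contacts per unit (a crude
    order measure)."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(size) for y in range(size)]
    particles = rng.sample(cells, n_units)
    occupied = set(particles)
    for _ in range(steps):
        i = rng.randrange(n_units)
        old = particles[i]
        new = rng.choice(neighbors(old, size))
        if new in occupied:
            continue                      # target cell taken; reject move
        occupied.discard(old)
        d_e = -interaction * (contacts(new, occupied, size)
                              - contacts(old, occupied, size))
        if d_e <= 0 or rng.random() < math.exp(-d_e / noise):
            occupied.add(new)             # accept: unit hops
            particles[i] = new
        else:
            occupied.add(old)             # reject: unit stays put
    return sum(contacts(p, occupied, size) for p in particles) / n_units
```

Note how the knobs map loosely onto the list above: `n_units` and `size` set the number of parts (1), `interaction` sets the strength of their coupling (2), and `noise` stands in for how energy input jostles the units (3 and 4).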

Building a model like this is not a trivial task. There are a number of researchers, including physicists, chemists, and information theorists, currently attempting this feat.

Some of my favorite research on emergence is conducted by computer scientists. There are computer programs where users provide simple instructions to small, randomly organized subunits and then watch for the spontaneous generation of complexity. One of the most famous of these programs is John Conway’s Game of Life. In Game of Life (link to download page) a grid of black and white squares switches color based on the colors of the adjacent squares. These simple local interactions can result in simple species like “gliders” or complex systems like the ones in the video below:

These black and white little creatures do not feed, but they do exhibit many other characteristics necessary for evolution. Their collisions (selection events) can lead to replication, death, an entirely new creature, or nothing at all. The study of these programs is an example of bottom-up research into the emergence of complexity.[3]
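For the curious, Conway's full rule set – a dead cell with exactly three live neighbors is born, a live cell with two or three live neighbors survives, everything else dies – fits in a few lines of Python. This is a sketch using a sparse set of live cells rather than a fixed grid:

```python
from collections import Counter

def step(live):
    """Advance Conway's Game of Life one generation.
    `live` is the set of (x, y) coordinates of live ('black') cells."""
    # Count how many live neighbors every candidate cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next generation with exactly 3 live neighbors,
    # or with 2 live neighbors if it is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: after four generations the same shape reappears,
# shifted one cell down and to the right.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
```

That a self-propagating "creature" falls out of three lines of rules is the whole point: nothing in the rule set mentions gliders.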

Top-down research is also underway by, for example, physicists who study the formation of sand ripples under water waves (pdf). By manipulating the variables (amount of sand, frequency of waves, etc.) they are attempting to uncover the equations that predict ripple emergence.

There is a small subset of chemists who are particularly interested in emergent phenomena: those trying to understand and recreate abiogenesis, the generation of living organisms from non-living building blocks. Abiogenesis depends on a series of emergent events, including cell membrane formation, self-replicator formation, protein/RNA/DNA folding, and a myriad of other processes that have defined life on Earth.

Recently, the number of chemists researching self-assembly and nano- to micrometer-sized formations – also emergent phenomena – has increased significantly due to the impending end of Moore’s law and the promise of nanotechnology. These phenomena usually involve the spontaneous emergence of structures through the control of reaction conditions.

I feel confident that chemists are particularly well suited to study emergence and contribute to the formulation of a fourth law of thermodynamics. We know how to control the concentration of subunits, the interactions between these subunits based on molecular structure or nanoparticle surfactants, the polarity of the solvent, and other variables in order to achieve specific emergent phenomena. Note the similarities between this list of variables and Professor Hazen’s list (1-4 above). Through the sheer force of combinatorial chemistry we are already inadvertently making progress towards this goal. It may just be that we need to put all of these pieces together in one model.

There is one significant caveat I have to mention when talking about emergent phenomena: I have used the word complexity several times relying on the intuitive sense of what it means. Yet, there is no universally accepted definition of complexity. How do we quantify this term in a way that would allow us to compare the complexity of a micelle to the complexity of the human brain?

The unfortunate answer, as pointed out by Hazen in Genesis: The Scientific Quest for Life’s Origins, is that we may simply not have the tools or language necessary to define complexity yet. Hazen likens it to the effort to define water before the understanding of atoms. We now know it as H2O, but before defining water’s atomic constituents there was no unifying theme. Is it a solid, liquid or gas? The answer is ‘yes’. A similar conundrum arose when scientists tried to classify the relationships among animals before the theory of evolution. Do we group them by size, shape, color…?[4]
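One crude proxy that is sometimes suggested: use the length of a losslessly compressed description as a stand-in for descriptive (Kolmogorov-style) complexity, which is itself uncomputable. The sketch below is only an illustration of the idea, not a serious measure:

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Bytes needed to describe `data` after zlib compression: a rough,
    compressor-dependent stand-in for descriptive complexity."""
    return len(zlib.compress(data, 9))

# A crystal-like repeating pattern versus pure noise, both 1000 bytes long.
perfectly_ordered = b"AB" * 500
rng = random.Random(42)
pure_noise = bytes(rng.randrange(256) for _ in range(1000))
# The repetitive string needs far fewer bytes to describe than the noise.
```

The catch is immediately visible: by this measure random static is "maximally complex," while a micelle and a brain are just somewhere in between – which is exactly the definitional trouble described above.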

I really hope that a new thermodynamic law describing emergence is on the horizon. I find satisfaction in encompassing theories that, like evolution, are simple, elegant, and obvious in retrospect.


[1] For an entertaining and informative demolition of intelligent design, watch this one-hour lecture by Prof. Kenneth Miller.

[2] For a particularly thought-provoking type of emergent complexity read up on Boltzmann Brains.

[3] Here are some more interesting results from John Conway’s Game of Life. One of my favorites demonstrates a dynamic Droste effect (take note of the beginning shapes before it zooms out).

[4] Many of the examples I used in the above post were taken from “Genesis: The Scientific Quest for Life’s Origins” by Robert M. Hazen and “Emergence: From Chaos to Order” by John H. Holland.


  1. Kenneth Hanson says:

    If one of you does decide to take the challenge because of this post, and successfully defines the fourth law of thermodynamics, feel free to give a shout out to chemistry-blog.com in your Nobel Prize acceptance speech.

  2. How do you feel about Kolmogorov complexity as a starting point for thinking about chemical complexity? http://en.wikipedia.org/wiki/Kolmogorov_complexity

    The statement “the Kolmogorov complexity of an isolated, non-equilibrium system must reach a maximum” would appear to be empirically true, but I haven’t thought of a way to frame it as a useful law of thermodynamics like ΔS > 0

    • Kenneth Hanson says:

      I have discussed the concept of using code-length to define complexity with a few computer scientists. While most of them like the idea, they immediately dismiss it on logistical grounds. You would need to choose one programming language (a feat in itself) and then be able to code for a system as efficiently as possible (how do you prove/support that?). The number of lines of code necessary to describe a system will likely depend on the level of abstraction you use as a starting point. Does your code describe molecules, atoms, quarks, or simply a spherical cow?

      With all of that said, I do think that it is as good of a starting point as anyone has suggested.

      I thought about including a discussion of defining complexity based on computer code length in the post but decided against it because it is already long enough.

  3. Have you looked at the evolution of evolvability that was just in MIT Technology Review http://www.technologyreview.com/view/428504/computer-scientists-reproduce-the-evolution-of/ ?

    It seems like the sort of method that they created there could be used given a different set of constraints or at least generalized to include the factors that you’re talking about.

  4. I’ve always thought (well not always, but I don’t remember when the idea first entered my head) that life is an emergent property of the universe.

    As you rightly state, complexity can spontaneously arise, and generally this results in increased entropy external to the system. (You called this entropy ‘a cost’, but I think since the drive is towards greater entropy that term is a bit misleading.)

    One classic example is the formation of convection cells in oil baths heated from the bottom (and not stirred). Of course, the reason these cells form is to enable a more rapid return of the system (oil bath) to equilibrium (even temperature).

    Now the universe is a long way out of equilibrium (we have hot spots and cold spots, dense spots and sparse spots). And we know systems self-organise to enable a return to equilibrium. Life (things that are alive) is great at increasing the entropy of its surroundings (we all give off heat…). So is it possible that life is an emergent property of the universe?

  5. The court decision in which Intelligent Design was labelled “progeny of creationism”, “not science”, and “religious” is fascinating reading. Wikipedia gives a thorough overview at http://en.wikipedia.org/wiki/Kitzmiller_v._Dover_Area_School_District and a link to the entire ruling.

  6. Very timely post with yesterday’s paper in Cell showing a crude model of the cell cycle of a small bacterium (http://dx.doi.org/10.1016/j.cell.2012.05.044). As these models get better, it seems like they could be very good tools for exploring how emergent properties arise from simpler molecular processes.

  7. I appreciate your note that the second law is real not ideal.

    In contemplating your ideas and thinking about the math of thermodynamics, after many years in other fields, I ran across this site that also reminds us about the nature of various types of systems in terms of thermodynamics. http://www.physics.uc.edu/~sitko/CollegePhysicsIII/12-Thermodynamics/Thermodynamics.htm .

    I wonder where someone would be able to publish a solution to emergence in our plethora of journals these days? Will it be too simple to be believable? What emergent system will be definitive enough to satisfy the data requirements?

    In any case it is good for all scientists to ponder the nature of the natural world – and this physics problem certainly brushes across many sciences.

    Thanks for sharing your interesting blog.

  8. The 2nd Law as written states succinctly and perfectly exactly what is intended. You, as well as the ‘intelligent’ designers, ignore the “closed systems” boundary condition. With that boundary there is no need at all to add the redundant statement about “increased entropy in the surrounding environment”.

    • Kenneth Hanson says:

      While you are correct in saying that ‘intelligent’ designers do not understand the closed system nature of the second law, you are missing the context of the above quote and ultimately point of the post. The second law, or the other laws, of thermodynamics do not prohibit the emergence of complexity. If they did they would be demonstrably false and could be thrown out. However, they do not explain or predict why, in closed systems, complexity can/will emerge. This is the reason that some suggest a fourth law. A formalization of the fourth law would generally need to define, within a closed system, a region of emergent complexity and the surrounding environment. That is why the statement “increased entropy in the surrounding environment” is not redundant but potentially crucial to defining the fourth law.

  9. Are we talking here of equilibrium thermodynamics? The zeroth to third laws are equilibrium laws and therefore cannot deal with evolution. One example given of the fourth law, if ever there should be one, is the Onsager reciprocal relations. Those have to do with non-equilibrium thermodynamics. They are not relevant at equilibrium.

    So to think of evolution in terms of equilibrium thermodynamics is a contradiction. If you want to talk about non-equilibrium thermo, then that is another story and there can be lots of debates. I see equilibrium thermo as a complete and closed topic, and there is no need for a fourth law.

    Those who advocate nonsense like intelligent design need to take revelation out of the equation.

    Besides thermodynamics does not do much more than give relationships between macroscopic things we measure. It does not give us numbers. Applications to models can work but you will have to leave the area of thermo and move to statistical treatments.

    I generally think that those who use pseudo science to promote revelation should be ignored. I actually think this is working because it has died down, I hope, and those left promoting it are blowing in the wind.

  10. Prediction is the Dual of entropy. The purpose of force is to make events more controllable, stable and predictable. If you send me an email I will send you more detail. The fourth law is based on target tracking and the perfect target is 100% democracy.

  11. Pingback: Where Does Complexity in Nature Come From? | stephenmartin17

  12. The 2nd law is wrong. Entropy is reversed every time molecular kinetic energy (waste heat) is converted to bipolar molecular potential energy – evaporation.
    Go here and explain why the perpetual tilting sponge motor is unworkable. It’s really educational and fun.
    Gives you great insight into the 2nd law.

