Does Nature Break the Second Law of Thermodynamics?

In seeming defiance of the second law of thermodynamics, nature is filled with examples of order emerging from chaos. A new theoretical framework resolves the apparent paradox

Science has given humanity more than its share of letdowns. It has set limits to our technology, such as the impossibility of reaching the speed of light; failed to overcome our vulnerabilities to cancer and other diseases; and confronted us with inconvenient truths, as with global climate change. But of all the comedowns, the second law of thermodynamics might well be the biggest. It says we live in a universe that is becoming ever more disordered and that there is nothing we can do about it. The mere act of living contributes to the inexorable degeneration of the world. No matter how advanced our machines become, they can never completely avoid wasting some energy and running down. Not only does the second law squash the dream of a perpetual-motion machine, it suggests that the cosmos will eventually exhaust its available energy and nod off into an eternal stasis known as heat death.

Ironically, the science of thermodynamics, of which the second law is only one part, dates to an era of technological optimism, the mid-19th century, when steam engines were transforming the world and physicists such as Rudolf Clausius, Nicolas Sadi Carnot, James Joule and Lord Kelvin developed a theory of energy and heat to understand how they work and what limits their efficiency. From these nitty-gritty beginnings, thermodynamics has become one of the most important branches of physics and engineering. It is a general theory of the collective properties of complex systems, not just steam engines but also bacterial colonies, computer memory, even black holes in the cosmos. In deep ways, all these systems behave the same. All are running down, in accordance with the second law.

But despite its empirical success, the second law often seems paradoxical. The proposition that systems steadily run down seems at odds with the many instances in nature not only of disorganization and decay but also of self-organization and growth. In addition, the original derivation of the second law has serious theoretical shortcomings. By all rights, the law should not apply as widely as it does.


Many of the scientists who founded thermodynamics were conscious of these failings and sought to formulate a more complete theory, a task taken up in the 20th century by Lars Onsager, Ilya Prigogine, Sybren de Groot, Peter Mazur and others. Yet even their more sophisticated approach had limited applicability. My colleagues and I have recently made progress in solidifying the foundations of thermodynamics and extending it into new realms. We have confirmed that the second law is universal but also found that it is not nearly as gloomy as its reputation suggests.

Out of Balance
Thermodynamics is one of the most widely misunderstood branches of physics. Laypeople and scientists alike regularly use concepts such as temperature, pressure and energy without knowing their rigorous meaning and subtleties. But those of us who plumb the theory’s depths are acutely aware of the need to take care. The Achilles’ heel of thermodynamics is that, strictly speaking, it applies only when the system under study is in a quiescent state called equilibrium. In this state the system’s parameters, such as mass, energy and shape, have ceased to change. Bringing two objects at different temperatures into contact makes heat flow from the hotter object to the colder. This process stops when both reach the same temperature—that is, when the two are in thermal equilibrium. From that point on, nothing changes.
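As a concrete illustration of this approach to equilibrium, the sketch below (my own example, with illustrative masses and temperatures, not a calculation from the article) brings two blocks of water into contact and computes the shared final temperature along with the total entropy change, which the second law requires to be nonnegative:

```python
import math

# Two blocks of water at different temperatures are brought into contact
# and exchange heat until they reach thermal equilibrium.
C1, C2 = 4186.0, 4186.0   # heat capacities, J/K (roughly 1 kg of water each)
T1, T2 = 353.15, 293.15   # initial temperatures, K (80 and 20 degrees C)

# Energy conservation fixes the common final temperature.
Tf = (C1 * T1 + C2 * T2) / (C1 + C2)

# Entropy change at constant heat capacity: dS = C dT / T, integrated
# from the initial to the final temperature.
dS_hot = C1 * math.log(Tf / T1)    # negative: the hot block loses entropy
dS_cold = C2 * math.log(Tf / T2)   # positive, and larger in magnitude

print(f"final temperature: {Tf:.2f} K")
print(f"total entropy change: {dS_hot + dS_cold:+.1f} J/K (never negative)")
```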

A common example is when you put ice in a glass of water. The ice melts, and the water in the glass reaches a uniformly lower temperature. If you zoom in to the molecular level, you find an intense activity of molecules frantically moving about and endlessly bumping into one another. In equilibrium, the molecular activity organizes itself so that, statistically, the system is at rest; if some molecules speed up, others slow down, maintaining the overall distribution of velocities. Temperature describes this distribution; in fact, the very concept of temperature is meaningful only when the system is in equilibrium or sufficiently near it.
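To make "temperature describes this distribution" concrete, here is a minimal sketch, not taken from the article: in equilibrium each velocity component of a molecule follows a Gaussian with variance k_B T/m (the Maxwell-Boltzmann distribution), so individual molecules dart about wildly while the average kinetic energy locks onto (3/2) k_B T. The gas (nitrogen) and temperature are illustrative choices.

```python
import math
import random

kB = 1.380649e-23   # Boltzmann constant, J/K
m  = 4.65e-26       # mass of one N2 molecule, kg
T  = 300.0          # temperature, K

# In equilibrium each velocity component is Gaussian with this spread.
sigma = math.sqrt(kB * T / m)

# Individual molecules move erratically...
molecules = [[random.gauss(0.0, sigma) for _ in range(3)] for _ in range(100_000)]

# ...yet the average kinetic energy matches (3/2) k_B T.
mean_ke = sum(0.5 * m * (vx*vx + vy*vy + vz*vz)
              for vx, vy, vz in molecules) / len(molecules)
print(f"measured <KE> = {mean_ke:.3e} J, predicted = {1.5 * kB * T:.3e} J")
```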

Thermodynamics therefore deals only with situations of stillness. Time plays no role in it. In reality, of course, nature never stands still, and time does matter. Everything is in a constant state of flux. The fact that classical thermodynamics is limited to equilibrium situations may come as a surprise. In introductory physics classes, students apply thermodynamics to dynamic systems such as car engines to calculate quantities such as efficiency. But these applications make an implicit assumption: that we can approximate a dynamic process as an idealized succession of equilibrium states. That is, we imagine that the system is always in equilibrium, even if the equilibrium shifts from moment to moment. Consequently, the efficiency we calculate is only an upper limit. The value that engines reach in practice is somewhat lower because they operate under nonequilibrium conditions.
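Concretely, the upper limit for an engine cycling between a hot reservoir at temperature T_h and a cold one at T_c is the Carnot efficiency. A worked instance, with reservoir temperatures of my own choosing rather than the article's:

```latex
\eta_{\max} = 1 - \frac{T_c}{T_h},
\qquad
T_h = 500\,\mathrm{K},\; T_c = 300\,\mathrm{K}
\;\Rightarrow\;
\eta_{\max} = 1 - \tfrac{300}{500} = 0.40 .
```

A real engine running between the same reservoirs converts less than 40 percent of the heat into work, because its operation is not a true succession of equilibrium states.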

The second law describes how a succession of equilibrium states can be irreversible, so that the system cannot return to its original state without exacting a price from its surroundings. A melted ice cube does not spontaneously re-form; you need to put it in the freezer, at a cost in energy. To quantify this irreversibility, the second law introduces a key quantity: entropy. Entropy is popularly described as the degree of disorder in the system, but as I will discuss later, this description can be misleading. Quantitatively, the change in entropy is the amount of heat a system exchanges reversibly divided by the temperature at which the exchange takes place. In an isolated system, entropy always stays the same or increases.
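In symbols, the definition just given (due to Clausius) and the law itself read:

```latex
% Entropy change for heat dQ exchanged reversibly at temperature T,
% and the second law for an isolated system:
dS = \frac{\delta Q_{\mathrm{rev}}}{T},
\qquad
\Delta S \ge 0 .
```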

For instance, a typical engine works by exploiting the flow of heat from a hot to a cold reservoir, which are two large masses exterior to the engine mechanism. If the reservoirs maintain a constant temperature and the engine parts are frictionless, the engine goes through its cycle in a completely reversible way; the total entropy remains constant. In a real engine, these idealizations do not apply, so the cycle is irreversible and the total entropy increases. Eventually the engine runs out of available energy, heat ceases to flow and entropy reaches a maximum value. At that point, the reservoirs and engine are in equilibrium with one another and will remain that way, unchanged, from then on.
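The entropy bookkeeping behind that increase is short: if heat Q leaks irreversibly from the hot reservoir at temperature T_h to the cold one at T_c, the hot side loses entropy Q/T_h while the cold side gains the larger amount Q/T_c, so

```latex
\Delta S_{\mathrm{total}} = -\frac{Q}{T_h} + \frac{Q}{T_c}
= Q\,\frac{T_h - T_c}{T_h T_c} \;\ge\; 0 ,
```

with equality only when the two temperatures match, that is, at equilibrium.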

The fact that classical thermodynamics presumes equilibrium situations limits the applicability of the second law. Entropy and temperature cannot even be defined unless the system is in equilibrium. Moreover, many systems cannot be modeled as a heat engine. The cosmos is one: if space is expanding, entropy can increase without limit, so that the universe approaches but never reaches equilibrium [see “The Cosmic Origins of Time’s Arrow,” by Sean M. Carroll; Scientific American, June 2008]. What these systems have in common is that they are not in equilibrium or even close to it.

Order from Chaos
Nonequilibrium systems behave in some fascinating ways that the classical theory of thermodynamics does not capture and that belie the idea that nature tends to become steadily more disordered. For instance, consider a familiar appliance, the electric toaster. The wire inside it heats up because the wire material offers resistance to the flow of electric current. The second law stipulates that this process is irreversible: you cannot use a toaster to untoast a piece of bread and thereby generate electricity.

You can, however, do something similar. You can impose a temperature difference between the tips of the toaster wire, thereby ensuring the system remains out of equilibrium. Then it will indeed generate electricity. This reversal is the basis of the thermocouple, a device used to measure temperature or produce power.
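In the linear regime, the thermocouple's open-circuit voltage is roughly proportional to the temperature difference between its junctions (the Seebeck effect). Here is a minimal sketch; the coefficient is a typical order of magnitude for a common thermocouple alloy pair and is my illustrative assumption, not a figure from the article.

```python
# Thermocouple sketch: in the linear regime the open-circuit voltage is
# proportional to the temperature difference between the junctions.
SEEBECK_V_PER_K = 41e-6   # assumed coefficient, ~type-K magnitude (V/K)

def thermocouple_emf(t_hot: float, t_cold: float) -> float:
    """Open-circuit voltage (V) for junctions at two temperatures (K)."""
    return SEEBECK_V_PER_K * (t_hot - t_cold)

# Holding the wire tips 100 K apart yields a few millivolts:
print(f"{thermocouple_emf(398.15, 298.15) * 1e3:.2f} mV")
```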

A related phenomenon is reverse osmosis for seawater desalination. In standard osmosis, the difference in salt concentration across a membrane creates a difference in pressure, ensuring that water flows to the saltier side and dilutes it. The system thereby approaches equilibrium. In reverse osmosis, an external pressure keeps the system out of equilibrium, forcing water to flow over to the less salty side and become potable.
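A rough sense of the pressure needed comes from the van 't Hoff relation for the osmotic pressure of a dilute solution, pi = c R T, with c the molar concentration of dissolved ions. The sketch below uses an illustrative seawater salinity of my own choosing; real plants must push somewhat harder than the result to keep the flow going.

```python
# Osmotic pressure of seawater from the van 't Hoff relation pi = c R T.
R = 8.314        # gas constant, J/(mol K)
T = 298.15       # temperature, K
c = 2 * 599.0    # total ion concentration, mol/m^3 (~0.6 M NaCl, two ions each)

pi = c * R * T   # osmotic pressure, Pa
print(f"osmotic pressure ~ {pi / 1e5:.0f} bar")
# Reverse osmosis flows only while the applied pressure exceeds this value,
# i.e. while an external force holds the system away from equilibrium.
```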

The toaster and thermocouple, and forward and reverse osmosis, are mirror-image processes. They are connected by the so-called reciprocity relation, the formulation of which won Onsager the 1968 Nobel Prize in Chemistry. The symmetry between these processes reflects the reversibility of the laws governing the motion of the particles of the system. Those laws work equally well backward or forward in time. The irreversibility we observe at a macroscopic level arises only when we consider particles en masse.
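Schematically, near equilibrium every flux (of heat, charge or particles) responds linearly to every thermodynamic force, and Onsager's reciprocity relation says the matrix of coefficients coupling them is symmetric:

```latex
J_i = \sum_j L_{ij} X_j,
\qquad
L_{ij} = L_{ji} .
```

For the thermoelectric pair above, the symmetry of the cross coefficients is exactly what ties the toaster direction, in which a current carries heat, to the thermocouple direction, in which a temperature difference drives a current.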

The discovery of the reciprocity relation changed how physicists think of equilibrium. They used to think of it as the most highly ordered state. Although the molecules may be maximally disordered, the system overall is placid, symmetrical and orderly. Yet the reciprocity relation exemplifies how a nonequilibrium system, too, can be highly ordered. Regularities, symmetries and islands of tranquility may come up in situations far from equilibrium.

Another classic example is a thin fluid layer heated from below. Heat flows from the bottom to the top, and a temperature gradient develops across the layer. By increasing the gradient, one can increase the departure from equilibrium. For modest gradients, the fluid remains at rest. For larger gradients, however, it begins to move. Its convective motion, far from being chaotic, is orderly. Small hexagonal cells form as if the fluid were a crystal. For even larger gradients, the motion becomes turbulent. This phenomenon, known as the Bénard problem, demonstrates that order can shade into chaos and back to order as a system deviates from equilibrium.
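The threshold for those cells is set by a single dimensionless group, the Rayleigh number, which weighs the buoyancy driving the flow against the viscosity and heat diffusion damping it; convection sets in when it exceeds a critical value of roughly 1,708 for a layer between rigid plates. Below is a minimal sketch with textbook property values for water and an assumed layer thickness, all illustrative:

```python
# Onset of Benard convection: compare the Rayleigh number with its
# critical value (~1708 for a fluid layer between rigid plates).
g     = 9.81      # gravitational acceleration, m/s^2
beta  = 2.1e-4    # thermal expansion coefficient of water, 1/K
nu    = 1.0e-6    # kinematic viscosity of water, m^2/s
kappa = 1.4e-7    # thermal diffusivity of water, m^2/s
d     = 5e-3      # layer thickness, m (illustrative)

RA_CRITICAL = 1708.0

def rayleigh(delta_t: float) -> float:
    """Rayleigh number for a temperature difference delta_t across the layer."""
    return g * beta * delta_t * d**3 / (nu * kappa)

for dT in (0.1, 0.5, 2.0):
    ra = rayleigh(dT)
    state = "convection cells form" if ra > RA_CRITICAL else "fluid stays at rest"
    print(f"dT = {dT:3.1f} K -> Ra = {ra:7.0f}: {state}")
```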

In yet another example, an experimenter begins with a fluid at rest. The fluid is isotropic: it looks the same in every direction. The experimenter then forces the fluid to pass through a metal grid at a certain speed. Although the fluid becomes turbulent on the downstream side, its motion still takes place in one direction. Thus, the fluid is no longer isotropic. As the experimenter increases the speed of the fluid, the turbulence increases and eventually becomes so great that the fluid no longer flows one way. At this point, the fluid is again isotropic. The fluid has gone from isotropic to anisotropic and back to isotropic—a type of progression from order to disorder to order.

Standard thermodynamics does not capture such phenomena, a limitation that has become all the more pressing in recent years. Researchers in molecular biology and the nascent field of nanotechnology have discovered a great diversity of organized but ever changing structures in physical, chemical and biological systems. To explain them requires a theory of nonequilibrium thermodynamics.

Breaking It Down
Earlier efforts to develop such a theory started from the concept of local equilibrium states. Although a system may not be in equilibrium, individual pieces of it can be. For instance, imagine stirring a cocktail with a swizzle stick. The equilibrium is disturbed by the motion of the stick but can still be found if you look closely at small pockets of fluid, which retain their internal coherence. These small regions are able to reach equilibrium if the forces acting on the system are not too large and if its properties do not change by large amounts over small distances. Concepts such as temperature and entropy apply to these islands of equilibrium, although the numerical values of these quantities may vary from island to island.

For instance, when one heats up one of the ends of a metal bar, heat flows through the bar toward the other end. The temperature difference between the ends of the bar acts as a force driving the heat flow, or flux, along the bar. A similar phenomenon occurs with a drop of ink in water. The difference in ink concentration is the driving force that makes the ink invade the host liquid until it becomes uniformly colored. The response to these forces is linear: the heat flux is proportional to the temperature difference and the particle flux to the concentration difference, a proportionality that holds even when the forces acting on the system are strong. Even in many turbulent flows, the internal stresses in the fluid are proportional to the velocity gradients. For these cases, Onsager and others formulated a theory of nonequilibrium thermodynamics and showed that the second law continues to hold.
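In symbols, these are Fourier's law of heat conduction and Fick's law of diffusion; in each, the flux is proportional to the local gradient, with a material-dependent coefficient:

```latex
\mathbf{J}_q = -\kappa\,\nabla T,
\qquad
\mathbf{J}_n = -D\,\nabla c .
```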

But when those conditions are not met, this theory breaks down. When a chemical reaction takes place, one substance suddenly changes into another—an abrupt change described by a nonlinear equation. Another type of failure occurs when the system is so small that the chaotic jumble of molecular motions dictates its behavior and causes the system’s properties to vary wildly over short distances. Processes taking place in small systems, such as the condensation of water vapor and the transport of ions through a protein channel in a cell membrane, are dominated by such fluctuations. In them, temperature and entropy cease to be well-defined quantities. Does the failure of the theory in these instances imply the failure of the second law, too?

In the past several years David Reguera of the University of Barcelona, José M. G. Vilar of the Sloan-Kettering Institute and I have extended thermodynamics into these realms. We have shown that many of the problems go away with a change of perspective. Our perception of abruptness depends on the timescale we use to observe these processes. If we analyzed one of the seemingly instantaneous chemical processes in slow motion, we would see a gradual transformation as if we were watching a pat of butter melting in the sun. When the process is viewed frame by frame, the changes are not abrupt.

The trick is to track the intermediate stages of the reaction using a new set of variables beyond those of classical thermodynamics. Within this expanded framework, the system remains in local thermodynamic equilibrium throughout the process. These additional variables enrich the behavior of the system. They define a landscape of energy that the system rambles through like a backpacker in the mountains. Valleys correspond to a dip in energy, sometimes involving molecular chaos, other times molecular order. The system can settle into one valley and then be kicked into another by external forces. If it is in the grasp of chaos, it can break away from disorder and find order, or vice versa.
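The valley hopping can be illustrated with a toy model; this is my own sketch, not the authors' calculation. It runs overdamped Langevin dynamics in a double-well landscape U(x) = x^4/4 - x^2/2, whose two valleys stand in for two locally equilibrated states; thermal noise supplies the kicks that carry the system over the barrier.

```python
import math
import random

def force(x: float) -> float:
    """Minus the slope of the double-well landscape U(x) = x**4/4 - x**2/2."""
    return -(x**3 - x)

dt, temperature, steps = 1e-3, 0.15, 200_000
kick = math.sqrt(2.0 * temperature * dt)   # thermal noise amplitude

x, side, hops = -1.0, -1, 0                # start in the left valley
for _ in range(steps):
    x += force(x) * dt + kick * random.gauss(0.0, 1.0)
    if x * side < -0.5:                    # landed well inside the other valley
        side, hops = -side, hops + 1

print(f"barrier crossings observed: {hops}")
```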

Next, consider the problem of fluctuations. Does thermodynamics fail when systems are excessively small? A simple example shows that the answer is no. If we toss a coin only a few times, it could happen, by chance, that we would get a series of heads. But if we flip the coin many times, the result reliably approaches an average. Nature flips coins quite often. A few particles moving around in a container collide only occasionally and can maintain large velocity differences among themselves.

But in even a seemingly “small” system, the number of particles is still enormous, so collisions are frequent and the particles’ speeds settle around an average (if slightly fluctuating) value. Although a few isolated events may show completely unpredictable behavior, a multitude of events shows a certain regularity. Therefore, quantities such as density can fluctuate but remain predictable overall. For this reason, the second law continues to rule over the world of the small.
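A quick simulation, offered as an illustration rather than anything from the article, makes the coin-flip argument quantitative: the relative spread in the number of heads shrinks roughly as one over the square root of the number of tosses.

```python
import random

# For each sample size, flip N fair coins many times and measure how much
# the heads count fluctuates relative to its mean.
for n in (10, 1_000, 100_000):
    counts = [sum(random.random() < 0.5 for _ in range(n)) for _ in range(200)]
    mean = sum(counts) / len(counts)
    std = (sum((c - mean) ** 2 for c in counts) / len(counts)) ** 0.5
    print(f"N = {n:>7}: relative fluctuation ~ {std / mean:.4f}")
```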

From Steam Engines to Molecular Motors
The original development of thermodynamics found its inspiration in the steam engine. Nowadays the field is driven by the tiny molecular engines within living cells. Though of vastly differing scales, these engines share a common function: they transform energy into motion. For instance, ATP molecules provide the fuel for myosin molecules in muscle tissue to move along actin filaments, pulling the muscle fibers to which they are attached. Other motors are powered by light, by differences in proton concentrations or by differences in temperature [see “Making Molecules into Motors,” by R. Dean Astumian; Scientific American, July 2001]. Chemical energy can drive ions through channels in a cell membrane from a region of low concentration to one of high concentration—precisely the opposite of the direction in which they would move in the absence of an active transport mechanism.
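The cost of that uphill transport is easy to bound. Pushing one ion against a concentration ratio c_high/c_low requires at least k_B T ln(c_high/c_low) of work; the tenfold gradient and the ATP figure below are illustrative assumptions of mine, not values from the article.

```python
import math

kB, T = 1.380649e-23, 310.0   # Boltzmann constant (J/K) and body temperature (K)
ratio = 10.0                  # assumed concentration ratio c_high / c_low
atp = 5e-20                   # rough standard free energy of ATP hydrolysis, J

w_min = kB * T * math.log(ratio)   # minimum work to move one ion uphill
print(f"minimum work per ion: {w_min:.2e} J; "
      f"one ATP (~{atp:.0e} J) can pay for ~{atp / w_min:.0f} such ions")
```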

The analogy between large and small machines is very deep. Fluctuations of the chemical energy affect a molecular motor in the same way that a random and variable amount of fuel affects the piston of a car motor. Therefore, the long tradition of applying thermodynamics to large motors can be extended to small ones. Although physicists have other mathematical tools for analyzing such systems, those tools can be tricky to apply. The equations of fluid flow, for example, require researchers to specify the conditions at the boundary of a system precisely—a Herculean task when the boundary is extremely irregular. Thermodynamics provides a computational shortcut, and it has already yielded fresh insights. Signe Kjelstrup and Dick Bedeaux, both at the Norwegian University of Science and Technology, and I have found that heat plays an underappreciated role in the function of ion channels.

In short, my colleagues and I have shown that the development of order from chaos, far from contradicting the second law, fits nicely into a broader framework of thermodynamics. We are just at the threshold of using this new understanding for practical applications. Perpetual-motion machines remain impossible, and we will still ultimately lose the battle against degeneration. But the second law does not mandate a steady degeneration. It quite happily coexists with the spontaneous development of order and complexity.

Note: This story was originally printed with the title, "The Long Arm of the Second Law".