On the Importance of Randomness

What would happen if the water molecules of Lake Tahoe, instead of moving randomly, were to causally align? “Lake Tahoe would take off over the Sierra Nevadas at something like 1000 mph (based on the average speed of H2O molecules at room temperature as derived from simple kinetic theory).” —Richard E. Dickerson, Molecular Thermodynamics, pp. 37-38.

Dickerson’s point, in this intensively mathematical textbook on thermodynamics, is that random actions are not necessarily the evil phenomena they are often portrayed as being, especially if we prefer to have our lakes stay in place rather than go shooting off into outer space. (The net outcome of the molecules moving in every random direction is that those motions cancel out and the water stays in place.) Indeed, random actions are more common—and more necessary—than they are often given credit for. And they are at the heart of thermodynamics, given that the “thermo-” in thermodynamics means “heat,” and heat is the energy of random actions. Yet thermodynamics is, arguably, a fundamental science.
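Dickerson’s figure can be checked against the simple kinetic theory he cites: the root-mean-square speed of a molecule is v_rms = sqrt(3kT/m). Here is a minimal sketch in Python (the constants are standard values; taking “room temperature” to be 298 K is my assumption):

```python
import math

# Kinetic-theory estimate of how fast water molecules move at room
# temperature: v_rms = sqrt(3kT/m).
k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 298.0                     # assumed room temperature, K
m_H2O = 18.015 * 1.66054e-27  # mass of one H2O molecule, kg

v_rms = math.sqrt(3 * k_B * T / m_H2O)  # metres per second
v_mph = v_rms * 2.23694                 # converted to miles per hour

print(f"v_rms = {v_rms:.0f} m/s  ({v_mph:.0f} mph)")
```

The result, roughly 640 m/s or about 1400 mph, is indeed “something like 1000 mph.”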

I have explained in past posts that I am trying to show how a naturalistic worldview—one that does not appeal to elements external to the physical Universe, such as Platonic Forms—can be gained by examining the traits of energy, which are entirely this-worldly. And to that end, I have been discussing in recent essays how energy makes arrangements of itself (as evidenced, for instance, in potential energy) and how many qualities of the physical world can be understood in terms of the structures it assembles. But my purpose now, in this essay, is to begin to introduce the other major trait of energy, which is that it enables things to move randomly. At a micro-scale level, that means that particles randomly vibrate, rotate, and translate (move from here to there), and collectively that can create macro-scale phenomena. I will present examples of the latter shortly.

One of the major problems with Plato (with his static eternal verities) is that he has no method for explaining change or even motion. But I have been arguing that science explains the world in terms of change (in terms of what we can do over again). So it is important to see how energy does not just create structures; it animates them.

And it is key to note how thermodynamics predates quantum mechanics. In other words, what I am about to argue does not depend on how we “interpret” quantum mechanics.

Independent evidence that randomness is a “real” quality of the physical world comes from Brownian motion (first observed in 1827). Particles from pollen grains suspended in water can be seen to move randomly because the water molecules around them are moving randomly in a way related to their internal energy. The water molecules are randomly vibrating/rotating/translating as an internal property of their own.
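The statistical signature of Brownian motion can be illustrated with a toy random walk (a purely illustrative sketch, not a model of real water): the average displacement of the particles stays near zero, yet the mean squared displacement grows in proportion to time—the relationship Einstein derived for Brownian motion in 1905.

```python
import random

random.seed(1)

# Toy Brownian motion: many particles each take unit steps in a random
# direction.  No individual path has a pattern, and the average
# displacement is near zero -- yet the mean *squared* displacement
# grows in proportion to the number of steps taken.
n_particles, n_steps = 5000, 100
positions = [0] * n_particles
for _ in range(n_steps):
    for i in range(n_particles):
        positions[i] += random.choice((-1, 1))

mean_disp = sum(positions) / n_particles
mean_sq = sum(x * x for x in positions) / n_particles

print(f"mean displacement   : {mean_disp:+.2f}  (about 0: the lake stays put)")
print(f"mean squared displ. : {mean_sq:.1f}  (about n_steps = {n_steps})")
```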


Physical chemistry is the attempt to explain and predict how complicated molecules get made—and so how complexity arises—starting with the three theories of thermodynamics, kinetics, and quantum mechanics. Rather than going into the myriad arcane details, I will here offer my own simple analogy by which it is possible to visualize the process. The question to answer is: How can we start with random processes and end up with complexity?

Of course, it is by realizing that the random processes are occurring within arrangements. And that makes all the difference.

I realize that the following may seem very foreign to those schooled in traditional approaches to knowledge. So for now I am not trying to convince you that I am correct (although, obviously, I think I am); rather, I am merely hoping that readers might come to accept that, yes, this is a different way of looking at the world, and it is a way worth understanding.

Thus, here is the analogy (having forewarned you that it might seem bizarre).

Imagine that you have a large paper sack, and you put into it a lot of small blocks. Some blocks will have charges on them, so that those parts of the blocks will attract or repel similar parts of the other blocks. Also imagine that the blocks are irregularly shaped, with protrusions and holes of various sizes, so that some protrusions will fit into some of the others’ holes and others will not. And finally, imagine that the walls of the sack themselves have similar qualities.

Now shake the sack so that the blocks within it are moving randomly. What will happen is that some of the blocks, based on their charges and topography, will combine with one another as the random jostling forces them to encounter one another. But many of these new combinations will also fall back apart and be replaced by newer combinations. And in that way novel complexities will be put together (with these “blocks” really being molecules moving randomly within the confines of some structure). Usually this will produce not just one outcome—it won’t make all one thing—but several outcomes in predictable proportions.

Thermodynamics is the study of which combinations will be most stable. So as new combinations arise and fall apart, the most stable ones will be those that endure to make it to the final outcome. The prediction of stability is based on how much energy is required or released in forming that new arrangement. It is like how blocks leaning on one another require less energy than blocks added to a wavering tower. The stable arrangement wins out in the end.
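The “predictable proportions” mentioned above can be sketched with Boltzmann factors, which weight each arrangement by exp(−E/kT). The energies below are made-up illustrative values (in units of kT), not data for any real system:

```python
import math

# At equilibrium, the population of each competing arrangement is
# proportional to its Boltzmann factor, exp(-E / kT).
# Energies here are illustrative, expressed in units of kT.
energies = {"arrangement A": 0.0, "arrangement B": 1.0, "arrangement C": 3.0}

weights = {name: math.exp(-e) for name, e in energies.items()}
total = sum(weights.values())
proportions = {name: w / total for name, w in weights.items()}

for name, p in proportions.items():
    print(f"{name}: {p:.1%}")
# The lowest-energy (most stable) arrangement dominates, but the others
# still appear in definite, predictable proportions.
```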

But then kinetics is the study of how speed can trump stability. Kinetics shows that what counts for a successful outcome might be to get there first. One boy might be best suited to win the girl, but the boy who arrives first on the scene might have a done deal before the best guy can even show up. Kinetics also includes factors such as steric hindrance, meaning that if something gets in your way, you lose even if otherwise you would win. The best track star might lose the race if he is boxed in by other slower runners.
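The competition between speed and stability can be sketched with two made-up reversible reactions (the rate constants are illustrative assumptions, not data for any real chemistry): a reactant R can become a fast-forming but less stable “kinetic product” P1, or a slow-forming but very stable “thermodynamic product” P2.

```python
# Speed vs. stability: R <-> P1 is fast to form but easily falls apart;
# R <-> P2 is slow to form but rarely falls apart.  Illustrative values.
kf1, kr1 = 1.0, 0.5      # kinetic product P1
kf2, kr2 = 0.05, 0.001   # thermodynamic product P2

R, P1, P2 = 1.0, 0.0, 0.0
dt = 0.01
snapshot_early = None
for step in range(1, 200_001):          # simple Euler integration
    d1 = (kf1 * R - kr1 * P1) * dt
    d2 = (kf2 * R - kr2 * P2) * dt
    P1 += d1
    P2 += d2
    R -= d1 + d2
    if step == 200:                     # early in the reaction (t = 2)
        snapshot_early = (P1, P2)

print(f"early:  P1 = {snapshot_early[0]:.2f}, P2 = {snapshot_early[1]:.2f}")
print(f"final:  P1 = {P1:.2f}, P2 = {P2:.2f}")
# Early on, the fast product P1 dominates; at equilibrium, the stable
# product P2 has won out.
```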

And quantum mechanics is the basis for molecular orbital theory, meaning that it provides the theoretical grounding for why atoms form bonds at all. I will leave that subject to part two of this essay except to hint that even quantum mechanics fits with this analogy (of describing random actions within arrangements) via its “particle in a box” derivations.
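As a small foretaste of that hint, here is the standard particle-in-a-box energy formula, E_n = n²h²/(8mL²), evaluated for an electron confined to a (hypothetical) one-nanometre box. Confinement within the arrangement is exactly what makes the allowed energies discrete:

```python
import math

# Particle in a box: a particle confined to a box of length L has the
# discrete allowed energies  E_n = n^2 h^2 / (8 m L^2).
# Example: an electron in a 1-nanometre box.
h = 6.62607015e-34    # Planck constant, J*s
m_e = 9.1093837e-31   # electron mass, kg
L = 1e-9              # box length, m
eV = 1.602177e-19     # one electron-volt in joules

def energy(n: int) -> float:
    return n**2 * h**2 / (8 * m_e * L**2)

for n in (1, 2, 3):
    print(f"E_{n} = {energy(n) / eV:.3f} eV")   # energies scale as n^2
```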

Hey, I warned you this would be different from Enlightenment-era views where events happen by being “directed” from outside of the Universe, either by external laws or by first causes.


The image of the particle-filled “sack,” of course, is just a metaphor to express the general idea of randomly moving particles within some kind of special environment (an arrangement, a setup). Following are some real examples in science that constitute more detailed illustrations of this approach to knowledge. I will try to provide enough examples to suggest how ubiquitous this phenomenon is without going so overboard as to be tedious about it.

Diffusion. One example is diffusion across a semipermeable membrane, such as a cell membrane. The particles move randomly, and this random motion creates an even mixing of the particles. Yet in a slightly bigger picture we can see that this even mixing moves the particles from an area of high concentration to one of lower concentration, such as oxygen molecules going from the air in the lungs into the blood, or from the blood into a cell. The particles are not receiving instructions from a law (“Thou shalt cross the membrane”); rather, the setup of circumstances plus the randomness produces what is occurring.
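A toy simulation makes the point (the crossing probability is an illustrative assumption): no particle “knows” the concentrations, yet the net flow runs from the crowded compartment to the sparse one until the mix is even.

```python
import random

random.seed(2)

# Two compartments separated by a membrane.  Each randomly moving
# particle has the same small chance p_cross of happening to cross each
# tick.  More particles cross from the crowded side simply because
# there are more of them there -- that IS the net flow of diffusion.
left, right = 900, 100
p_cross = 0.05

for _ in range(200):
    l_to_r = sum(random.random() < p_cross for _ in range(left))
    r_to_l = sum(random.random() < p_cross for _ in range(right))
    left += r_to_l - l_to_r
    right += l_to_r - r_to_l

print(f"after mixing: left = {left}, right = {right}")  # roughly even
```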

Battery. If the semipermeable membrane is within a battery, and the particles are ions carrying a charge, then their net movement creates a current flow. Again, there is no “Thou shalt” or succession of causes about it. The setup is what makes the random actions deliver a current.

Movement within a Cell. Nutrients and wastes move within a cell via the random collisions of its vibrating molecules until they reach a place where they are utilized or disgorged.

Enzymes. Enzymes work by holding reactants together in place, thereby catalyzing their reaction. But in order to hold the reactants, the enzyme first must bump randomly into many possible reactants before it finds the one(s) that fit the enzyme like a hand in a glove.

Lining up within a Cell. A cell often works by requiring molecules to line up in a certain order. Examples are when ions line up in heart cells to discharge to make a contraction, or in neurons to discharge a nerve impulse, or when amino acids line up in the proper order to make a protein as recorded in DNA. But the way they line up is not by following a law or by being pushed into position via a succession of causes, but by the molecules moving randomly until they happen upon a receptor they can fit with.

Coupled Reactions. Some chemical reactions within a cell do not occur just by passive random actions but are powered by energy released from other chemicals (that energy ultimately coming from food). But this transfer of energy occurs via the random jostling of the molecules until the energy donor and all of the other necessary components of a reaction happen to meet.

Filtration. Filtration occurs, in both kitchen utensils and in the kidneys, as particles suspended in a liquid move randomly and so are either able, or not able, to fit through pores of a certain size.

Gene-Switching. Genes are able to switch on or off, as needed, and they do so through the random accumulation of other molecules related to their work.

Gatekeeping. A similar phenomenon to gene-switching occurs with electrons in transistors and with photons in photonic crystals. I elaborated on that in the post on photonic crystals.

Crystal Formation. The lattices of crystals are made of repeating units of atoms. But each unit of the crystal arrives at its place via random actions and then stays there (if it does) because of how its own features mesh with the surrounding elements.

Evaporation. Liquid water changes into water vapor as photons from the sun or another heat source randomly strike the water, giving individual water molecules sufficient energy to escape from the liquid into the air.

Dissolving into Solution. A solid gradually dissolves in a solvent, and seems to disappear, as the molecules of solute and solvent randomly interact, changing what is attracted to what.

Precipitation. Molecules dissolved in a solvent can become too numerous to stay dissolved under those circumstances and so randomly precipitate as solids, as in raindrops precipitating from clouds. If the temperature is cold, the precipitates can take the form of randomly different snow crystals.

Combustion. Fire occurs as oxygen molecules randomly combine with a fuel source.

Static Electricity. Static electricity builds as surfaces randomly rub against one another.

Friction. Friction develops as a force as the holes and protrusions of surfaces rub against one another. There is no general rule from which the value of the friction can be determined because in each case the random distribution of holes and protrusions will be different.

Thresholds. In electronics, a capacitor gradually accumulates randomly arriving electrons and then, at a threshold, discharges its entire contents all at once. A similar phenomenon can occur with other containers and substances.

Darwinian Evolution. The survival of the fittest depends on chance encounters wherein there will be a survivor of a contest, with the survivor being the one having features best adapted to that environment. Hence descendants of the survivors of chance encounters will be adapted to that environment.

Hopfield Networks. Even neural nets, at least in the Hopfield formulation, work in a manner that is patterned after chemical reactions. Optimization occurs via the repetition of multiple trials, each with a randomly different bias, until stability is achieved. (Recently, this approach has also been used to store memories. Fascinating stuff.)
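A minimal Hopfield sketch in plain Python (a toy with one stored pattern, not a production implementation) shows the idea: neurons are updated one at a time in random order, and the network settles into its stored, most stable state—recovering the memory from a corrupted input.

```python
import random

random.seed(3)

# A tiny Hopfield network: one 8-neuron pattern stored via the Hebbian
# rule, then recovered from a noisy copy by random asynchronous updates.
pattern = [1, 1, 1, 1, -1, -1, -1, -1]   # the memory to store
n = len(pattern)

# Hebbian weights: w[i][j] = x_i * x_j, with no self-connections.
w = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
     for i in range(n)]

# Start from a corrupted copy of the memory (two bits flipped).
state = pattern[:]
state[0] = -state[0]
state[5] = -state[5]

for _ in range(100):                      # random asynchronous updates
    i = random.randrange(n)
    field = sum(w[i][j] * state[j] for j in range(n))
    state[i] = 1 if field >= 0 else -1

print(state)   # the noisy input has relaxed back to the stored pattern
```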

changing scales

What we can see happening in each of these examples is that a macro-scale accomplishment is gained (a current flows from a battery, a liquid is filtered of debris, a new chemical is sitting in our beaker) while at the micro level random actions are occurring within special setups. And that can be contrasted with the philosophical view which discerns copious examples of macro-scale nonrandom actions and assumes that this macro non-randomness must be operating “all the way down” to what is happening at the micro scale.

How can we better understand this transition from micro to macro scale?

It is sometimes argued that micro-scale random events (especially the probabilities of quantum mechanics) somehow “average out” to create macro-scale directed actions. But that misses how the average of everything moving randomly is a net zero, like Lake Tahoe staying in place, or like the heat death of the Universe. The average of a lot of randomness is more randomness, not a concerted action. To my thinking, to get from random micro actions to a macro-scale outcome, we need to factor in the role of the arrangement of the setup.

Another consideration is that some people understand “stochastic” as being a statistical treatment of a phenomenon that is made because the number of micro interactions is too great to account for all of them feasibly. The argument is that, since there are too many interactions to add up all of their influences, we supposedly resort to statistics for that reason (which can be used to make the additional argument that the actions still must be causal, even though we can’t keep track of them all).

But that is not what I am saying. I am proposing that the role of the arrangement of the setup is integral and indispensable and must be included in any explanation of what is happening.

The philosopher David Hume held, in the 1700s, that every action has a cause. Accordingly, he argued that there is no such thing as what he called “chance” (randomness). His thinking has contributed greatly to the common intuition that, since causation is so easily observed in our daily lives, it must be the case all the way down to the micro level. But if we invoke a definition of “random” that is often used in science—that “random” means “exhibiting no patterns”—then there are all kinds of ways of formally showing that a phenomenon has no pattern (as is shown, for instance, with Brownian motion).

Then, by that definition, the above-mentioned micro-events are not causal. They are not directed in any fashion, either as in following laws or as in participating in a chain of causation. Instead, they are acting without any pattern.

The world is actually full of chance (because energy gives particles the ability to move randomly).

Today, however, many people—and certainly many philosophers—seem to share Hume’s intuition that causality is fundamental. Yet my point is that thermodynamics takes heat energy—randomness—as what is fundamental. And the two (randomness and causality) are not the same since causality is a type of pattern, and randomness is the absence of a pattern. The easiest way to see that is by examining the types of energy (which I am about to do). 

First, though, I should add that, of course, not all philosophers have followed Hume’s lead in this matter. For instance, the philosopher John Norton has argued that causality is too “plastic” a notion to be of use to science. If you can always make it work by changing how you formulate the issue, then that is hardly a good yardstick by which to measure what is right or wrong. (See my very first two posts, “Is Causality a Cluster Kind?”).

types of energy

So I want to conclude by looking closer at how thermodynamics does indeed take random action as fundamental, not causal action as fundamental. And we can do that by considering the types of energy.

Energy can take many forms, such as kinetic, potential, light, heat, chemical, and mechanical energy. For instance, heat energy is the energy of random motions. Mechanical energy is the energy imparted as one object pushes on another, and chemical energy is the energy captured in bonds holding atoms together. And one type of energy can be converted to another type.

The conversion of heat energy to mechanical energy can be illustrated by the steam engine. In the boiler of the steam engine, water is heated into steam which then builds up pressure. (The kinetic energy increases as the steam molecules acquire ever greater random velocities). Then the steam is allowed to escape the boiler through a hole onto a gear, which then turns. Thus the randomness of heat energy is converted into mechanical energy as now the turning gear can be used to propel other actions.

It is not necessary to start with pushing actions in order to end up with pushing actions. We can start with the randomness of steam within a special setup which converts it into pushing actions.

And once this mechanical energy is formed, it can spread from object to object via a succession of further pushing actions, so that one event leads to another in the Newtonian fashion that we witness. But the second law of thermodynamics stipulates that no such pushing action can be 100% efficient, and some of the mechanical energy (the ability to push on things) will be lost in the form of escaping heat energy (random actions) during the process.

And likewise, other setups can convert one type of energy into the other forms of energy, each with its own qualities, relationships, and abilities. In this manner, still more complexity can be created.

Then we can look closer at Hume’s intuition that causality is fundamental. From the standpoint of studying energy, Hume’s position is like asserting that mechanical energy is fundamental, that things pushing on other things comes first and is the basis for everything else. Hume’s approach is today even called “mechanical philosophy” (by the philosophers who advocate it).

But thermodynamics takes heat energy (things randomly vibrating/rotating/translating) as what is fundamental (and as what gets turned into everything else), since everything has a temperature, and temperature is the average kinetic energy of all of a thing’s random actions. In other words, a particle moves because it has energy internal to itself, not just because it is impacted mechanically from outside itself. To credit only mechanical energy with making things animated is to miss a lot of what is going on in the world.

And there is another consideration.

Dickerson, in his textbook, makes a point of showing how the major terms of thermodynamics (enthalpy, entropy, internal energy) were deliberately defined in a manner so as to make them independent of “the path” of actual events (pp. 76-89). The “path” is the route by which a big change occurs—it is the accumulated moment-by-moment little changes in what is happening—so the path is where any cause-and-effect might exist. But that is what is being skipped over. All that we need to predict the outcome of a chemical reaction are the initial and final values of these quantities so as to subtract them to find the net change. We don’t need to know how they got from the initial to the final values. And that is good because this way the path can be random.
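This path-independence can be illustrated numerically with an ideal gas (a textbook-style sketch; the two states and the paths between them are arbitrary choices of mine): the heat q and work w differ from path to path, but their sum—the change in internal energy, a state function—comes out identical.

```python
# One mole of a monatomic ideal gas (U = (3/2) n R T) taken from state 1
# to state 2 along two different paths built from constant-volume and
# constant-pressure steps.  Here w is the work done ON the gas, so each
# step obeys dU = q + w.
R_GAS = 8.314            # gas constant, J/(mol K)
n = 1.0                  # moles
Cv = 1.5 * n * R_GAS     # heat capacity at constant volume

def temp(P, V):
    return P * V / (n * R_GAS)          # ideal-gas law

def leg(start, end):
    """q and w for one constant-V or constant-P step."""
    (P1, V1), (P2, V2) = start, end
    dU = Cv * (temp(P2, V2) - temp(P1, V1))
    w = 0.0 if V1 == V2 else -P1 * (V2 - V1)   # w = -P dV at constant P
    return dU - w, w                           # q = dU - w

state1 = (249420.0, 0.01)    # (P in Pa, V in m^3): T = 300 K
state2 = (166280.0, 0.02)    # T = 400 K

paths = {
    "A": [state1, (state2[0], state1[1]), state2],  # const-V cool, then const-P expand
    "B": [state1, (state1[0], state2[1]), state2],  # const-P expand, then const-V cool
}

results = {}
for name, pts in paths.items():
    q = w = 0.0
    for a, b in zip(pts, pts[1:]):
        dq, dw = leg(a, b)
        q += dq
        w += dw
    results[name] = (q, w)
    print(f"path {name}: q = {q:8.1f} J, w = {w:8.1f} J, dU = {q + w:7.1f} J")
```

The printout shows different q and w on the two paths but the same dU, which is why we can subtract initial from final values without tracking the path.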

It is like how we can make predictions about a person’s weight change just by knowing the net consumption of calories and net amount of exercise, without needing to know the moment-by-moment details of how one minor fluctuation in weight supposedly caused the next.

For reasons of space, I am not going to go into all of the ways that causality can still be used profitably in a manner consistent with thermodynamics. Suffice it to say that there are many, but they are different from how the old Enlightenment philosophers described causality.


It is certainly easy to understand how people can look around at their world and find examples of causality everywhere and conclude that causality must extend “all the way down” to the micro level of events. And even more, it is easy to see the world strictly in terms of forces. But when we do that, we encounter the problem of how to account for the formation of complexity. How does complexity get made, if all that is happening is that the world is pushing around what it has already done? That is especially perplexing if we look only at the equations, and a few decades ago that question spawned the development of “complexity studies” such as at the Santa Fe Institute.

But if we see the world in terms of energy, then we can appreciate the role of arrangement in creating structures. And making structures explains how complexity can arise.

It is just that taking energy as fundamental also introduces a role for randomness in building that complexity, and that can seem contrary to mechanical thinking. For instance, some people want to explain away the probabilities in quantum mechanics.

But we don’t need to explain away the randomness in the world (although I agree randomness is inconsistent with mechanical thinking). What we should do is see how that randomness contributes to making the world operate as it does.

Then we can consider these two qualities of energy—how it enables particles to move randomly but also lets them stick together to make arrangements—and appreciate how that combines to create a world of complexity.
