Let’s unpack the concept of entropy, which is an abstract term loosely connected to the idea of randomness. The term “entropy” appears in different fields of study with different meanings.
In thermodynamics and statistical mechanics, entropy describes the disorder or dispersion of energy in a system. In information theory, entropy measures the uncertainty of the information in a message or data set: the less predictable the content, the higher the entropy. In finance, entropy can be used to quantify the randomness in the movement of securities’ prices or the unpredictability of performance.
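If you like to see an idea expressed in code, here is a minimal sketch of the information-theory version of entropy (Shannon entropy) for a string of symbols; the function name and sample messages are illustrative, not drawn from any particular library.

```python
# A minimal sketch of Shannon entropy: the average uncertainty, in bits,
# per symbol of a message. The example messages are made up.
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average number of bits of uncertainty per symbol in the message."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 -> perfectly predictable, no uncertainty
print(shannon_entropy("abcdefgh"))  # 3.0 -> every symbol equally surprising
```

A message made of one repeated character carries no surprise, so its entropy is zero; a message in which every character is different and equally likely is maximally unpredictable.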
What’s common to all these applications is the idea that chaos exists in otherwise orderly systems. As we’ll discuss shortly, the process of creating singular links often involves harnessing chaos and reducing entropy.
For a physical example of entropy, consider the internal combustion engine of the Toyota Corolla, which Wikipedia says is the world’s best-selling car. When you start your Corolla, gasoline is burned to drive the pistons, but some of the energy is inevitably lost to heat, sound, and friction as it is transferred to the car’s other mechanical components. These losses represent an increase in entropy because energy is being dispersed rather than used efficiently. The heat produced as a by-product of the engine’s mechanical process must be constantly dissipated to prevent overheating.
Entropy is not simply the amount of energy that is unavailable for conversion into mechanical work. Entropy doesn’t tell you, for instance, that for every gallon of gas consumed, 5% of the potential mechanical energy is lost in the burning process. Rather, entropy is a measure of the dispersal of energy within a system. It tells you how far-flung that energy becomes once it is lost.
Think of it in the context of a school field trip where 20 kids visit a museum, and one of the kids wanders away from the group. Entropy doesn’t tell you that one of the 20 kids got lost. Entropy tells you how far that one kid traveled while separated from the group. The entropy might be small, such as if the kid remained in the same museum but ended up in a different room. Or the entropy could be large, such as if the kid climbed on a Greyhound bus and traveled across the country. For the adults responsible for keeping track of the kids on the field trip, the difference between small entropy and large entropy could equate to the difference between gainful employment and unemployment.
The laws of physics show that systems, processes, and interactions tend to evolve towards states of greater disorder and randomness over time. This concept is succinctly captured in the second law of thermodynamics, which states that in an isolated system, entropy tends to increase over time.
Heat disperses as it flows from higher-temperature objects to lower-temperature objects, scattering its energy in the cooling process. The molecules in frozen water are highly ordered, but as the ice melts, the molecules become more disordered and move more freely. When you mix salt with water, the ions in the salt become more randomly dispersed in the solution than they were in the solid state. When a living organism dies, natural decomposition sets in, causing its particles to scatter. As the saying goes, “The only constant in life is change.”
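To put a rough number on the heat-flow example, here is a small sketch using the textbook relation that a small heat transfer Q at temperature T changes entropy by about Q/T; the temperatures and heat amount are made-up values chosen only for illustration.

```python
# Sketch: total entropy change when heat Q flows from a hot body to a cold one.
# Uses the textbook relation dS = Q / T, valid when the transfer is small enough
# that each body's temperature stays roughly constant. Numbers are illustrative.
Q = 1000.0      # joules of heat transferred
T_hot = 500.0   # kelvin, temperature of the hot body
T_cold = 300.0  # kelvin, temperature of the cold body

dS_hot = -Q / T_hot    # the hot body loses heat, so its entropy drops
dS_cold = Q / T_cold   # the cold body gains the same heat at a lower temperature

dS_total = dS_hot + dS_cold
print(f"Total entropy change: {dS_total:.2f} J/K")  # positive, as the second law requires
```

The cold body’s entropy gain outweighs the hot body’s entropy loss, so the total always comes out positive, which is the second law in miniature.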
So, if the natural tendency of the world is towards greater entropy, then reducing entropy requires activity. The historian Henry Adams said, “Chaos is the law of nature; order is the dream of man.” Much of human activity focuses on bringing order to an otherwise chaotic world.
People who show up to a job interview looking tidy, with groomed hair and a pressed outfit, communicate the message, “You can count on me as someone well practiced in the skills of entropy reduction.” They might as well wear a sign that says, “Hire me: I create order from chaos.”
In the next post, we’ll look at the relationship between entropy and singular links.