Difficult Creation, Easy Destruction
How entropy impacts our lives
The Second Law
I think about entropy each time I look at my living room and see my kids’ toys scattered everywhere. When we all clean up, we put on some music and take several minutes picking everything up, organizing it in its bins, and bringing order back to the chaos. But my wife and I know full well that the next day, it might take mere seconds for everything to return to a state of disarray.
Entropy is a measure of the disorder in a system. And in any closed system, it tends to go up over time, a fact known as the Second Law of Thermodynamics.
We see this phenomenon all over. Not just in physics or engineering, but everywhere. Consider, for instance, that it takes years to build strong relationships and good credibility, but it takes a single mistake to destroy them. Or that it takes decades, even centuries, to grow a forest or build a town, but a single sweeping wildfire to wipe it all away.
It’s the most unavoidable truth of the universe. It is really hard to create, and it is really easy to destroy.
One Way to Think About It
The idea here is that any system (be it a car engine or a burning star or a political party) will become less organized and more chaotic over time. But omnipresent though this phenomenon is, I don’t believe many people intuitively grasp why it happens. I know I didn’t for a long time.
So perhaps a better way of thinking about entropy is like this: Consider a deck of cards, fresh out of the box, with all the cards in their proper order (2, 3, 4 … K, A) and all of the suits separated from one another. Let’s call this state pure order.
And now, what state might we consider pure disorder? How about one in which no two consecutive cards sit near each other, and no two adjacent cards share a suit?
In the state of pure order (the new-box-of-cards state), if I show you any card, you can predict with 100% certainty what the next card in the deck is. In the purely disordered state, by contrast, you should never be able to predict the next card; no card is any more likely to appear next than any other. We can call this state pure randomness (a loaded term that warrants a future article of its own).
So let’s think of entropy as the amount of information someone would need to give us before we could predict the next card in the deck. In the case of the new deck, the answer is zero: since we know the rule governing the deck order, we already have all the information we need to know where every card is. In the case of a random deck, we need enough information to specify the position of each of the 52 cards (excluding jokers), since no algorithm or pattern exists to tell us where any card is.
The random deck (with high disorder, or high entropy) requires a huge amount of information to describe its state. The new deck (with no disorder and low entropy) requires none. So, Entropy = Information.
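We can even put a number on it. Here is a minimal Python sketch of the idea (the framing in bits is my own gloss on the card example, not a formal derivation): a shuffled deck can be in any of 52! arrangements, so pinning down one arbitrary arrangement takes log2(52!) bits of information, while the fresh deck takes zero.

```python
import math

# A fresh deck is fully determined by one known rule (new-box order),
# so describing its state takes zero additional bits of information.
bits_new_deck = 0

# A thoroughly shuffled deck could be in any of 52! arrangements, so
# pinning down one arbitrary arrangement takes log2(52!) bits.
arrangements = math.factorial(52)
bits_random_deck = math.log2(arrangements)

print(f"Possible arrangements: {arrangements:.3e}")       # ~8.066e+67
print(f"Bits for the new deck: {bits_new_deck}")          # 0
print(f"Bits for a random deck: {bits_random_deck:.1f}")  # ~225.6
```

Those roughly 226 bits are the entropy of the shuffled deck in information-theoretic terms; the fresh deck sits at zero.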
We have a spectrum. On one end: order and certainty. On the other end: disorder, chaos, and randomness.
Now consider which is more likely as I shuffle the cards. A state of order, in which I can figure out where every card is without any additional information? Or a state of disorder, in which the deck drifts further and further from the organized state, requiring more and more additional information? Obviously the latter.
With every game, with every passing shuffle, there are many more states in which the system can be disordered than states in which it can be ordered. And therefore, on average, the more I use the cards, the more disordered the deck gets.
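To watch that drift happen, here is a small simulation sketch. The order_score function is a made-up proxy for orderliness (it simply counts adjacent pairs still in new-deck sequence), not any standard measure.

```python
import random

def order_score(deck):
    """Count adjacent pairs still in new-deck sequence (max is 51)."""
    return sum(1 for a, b in zip(deck, deck[1:]) if b == a + 1)

deck = list(range(52))    # fresh deck: 0, 1, 2, ..., 51
print(order_score(deck))  # 51 -- pure order

random.shuffle(deck)
print(order_score(deck))  # tiny: about 1 on average
```

Of the roughly 8 × 10^67 possible arrangements, exactly one scores 51, so a shuffle essentially never stumbles back into order by accident.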
It’s worth remembering that moment in Andy Weir’s The Martian, in which the narrator exclaims, “Once I got home, I sulked for a while. All my brilliant plans foiled by thermodynamics. Damn you, Entropy!”
Entropy Everywhere
So why does this matter? Well, there are the scientific reasons. Entropy is the reason why fuel is a finite resource. It’s the reason why the universe will eventually end. It’s the reason why time flows in one direction and not both. (This last concept formed the basis for the 2020 film Tenet.)
And in fact, that equation we derived earlier, entropy = information, is one of the most beautiful I know of in science. Magically, physics and information theory are linked. The same math used to encode this text into bits and send it to your computer or phone can also explain why time flows in one direction and not the other.
Yet there is a broader, more philosophical interpretation that I encourage us all to remember.
Entropy is the reason why decades of diplomatic efforts can be irreversibly broken in a single day. Or why a lifetime of career building in public service can be undone by a single bad debate performance. Or why it might take an author years to publish a novel only to have it ripped apart in a 140-character tweet.
It is really hard to create, and it is really easy to destroy.
With every event, with every passing day, there are many more states in which the world can be disordered than states in which it can be ordered. And it is for this reason that we—meaning civilization—should take seriously the threats to our fragile systems (be they political or diplomatic or environmental). That isn’t to say that things will go wrong. It’s just that there are many ways they can go wrong, and it’s statistically improbable that they will continue to go right indefinitely.
In other words: It’s taken a really long time to get to where we find ourselves today, so let’s be mindful not to knock it over all at once.