Chapter 24
What is Entropy?

Entropy is an abstract and often slippery concept, yet it is incredibly powerful. Its commonest definition is that it is a measure of disorder or randomness. There are more precise mathematical and statistical definitions, but we need not discuss them here. It is well accepted that as the universe ages its entropy increases overall, although there appear to be exceptions to this rule. The commonest example we know of is life, for life always tends to make organised systems out of the surrounding chaos. These are only local reductions, however, paid for by a greater rise in the entropy of the surroundings; in the vast scheme of the universe, life remains trivial and only local.

Many scientists predict that all order and structure will eventually be erased as the universe settles into its final state, which has been termed “Heat Death”. This is reached when everything has spread out to the same temperature and there is no free energy left to change anything. Entropy will then be at its maximum.

A common question is: how does energy influence entropy, and what is the relationship between the two?

We could say that entropy is more: more movement, more disorder, more randomness in structure. Does energy fuel that state of change? It does, for it takes energy to separate and scatter particles, and heat flowing into a system raises its entropy. And that usable energy can run down. Eventually.
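
One standard way to make the link precise, offered here as a sketch rather than anything derived in this chapter, is the thermodynamic relation between heat and entropy:

dS = \frac{\delta Q_{\mathrm{rev}}}{T}

Here \delta Q_{\mathrm{rev}} is a small amount of heat added reversibly and T is the absolute temperature. The same parcel of energy adds more entropy to a cold system than to a hot one, which is why heat flowing from a hot body to a cold one always increases the total entropy.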

There is an equation for entropy, and it involves the number of possible configurations available to a system: the more states available, the more entropy. But can entropy be stored, creating a kind of potential entropy?
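
The equation being alluded to is, presumably, Boltzmann's statistical formula, given here only as a pointer rather than as part of the argument above:

S = k_B \ln W

Here W is the number of microscopic configurations (microstates) compatible with what we observe on the large scale, and k_B is Boltzmann's constant. Because of the logarithm, doubling the number of available states adds a fixed amount of entropy rather than doubling it; more states always means more entropy, just not in direct proportion.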

This might be useful.
