Few concepts in physics have been so persistently misunderstood — and so stubbornly defended in their misunderstanding — as entropy.
Outside technical contexts, entropy is almost universally glossed as disorder.
This imagery is vivid, intuitive, and almost entirely misleading.
In this post, we remove the metaphor of disorder altogether and replace it with a relational account that does actual explanatory work.
1. Why “disorder” feels right — and why it fails
The metaphor of disorder persists because it appears to match experience.
But appearances conceal the real issue.
A shattered glass is not less structured than an intact one; it is structured differently, and in vastly more ways.
What changes is not the amount of structure, but the number of structurally compatible continuations.
Calling the shards “disorder” reports a judgement about which arrangements we happen to value.
Entropy is not that judgement.
2. Entropy as a count of availability
Stripped of metaphor, entropy is a statistical measure.
More precisely, it is a measure of how many micro-configurations are compatible with a given macro-description.
From the perspective of the relational ontology, this can be stated more directly:
Entropy measures the density of relationally compatible re-actualisations of a construal.
High-entropy configurations are those that can be re-cut in many ways without violating the constraints that define them.
Low-entropy configurations are those that can be re-cut only in very specific ways.
Nothing here implies mess, decay, or failure.
It implies availability.
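To make the counting concrete, here is a minimal sketch in Python, assuming nothing beyond a toy system of one hundred two-state elements whose macro-description fixes only how many of them are “up”. The entropy is then the logarithm of the number of compatible micro-configurations, Boltzmann’s S = k ln W with k set to 1; the function toy_entropy and the toy system itself are illustrative inventions, not part of the relational account.

```python
import math

def toy_entropy(n_elements: int, n_up: int) -> float:
    """Entropy of a coarse description that fixes only the number of 'up' elements.

    Every arrangement with exactly n_up elements up is a compatible
    micro-configuration, so the count W is a binomial coefficient and the
    entropy is ln W (Boltzmann's S = k ln W, with k = 1).
    """
    compatible = math.comb(n_elements, n_up)  # W: number of compatible micro-configurations
    return math.log(compatible)

# A maximally specific description admits exactly one continuation...
print(toy_entropy(100, 0))    # 0.0
# ...while a loose one admits astronomically many.
print(toy_entropy(100, 50))   # ~66.8, i.e. about 1e29 compatible arrangements
```

Nothing in the count mentions mess or decay; it only registers how many ways the same description can be realised.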
3. Why high entropy looks “boring”
Another common intuition is that high-entropy states are dull or featureless.
But this boredom is a projection of our interests, not a feature of the configuration.
High-entropy configurations can be realised in innumerable ways that all satisfy the same coarse description.
Their apparent sameness is a triumph of relational compatibility, not a loss of structure.
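A rough sense of scale is useful here. The snippet below, using the same illustrative toy system as above, asks what fraction of all 2^100 micro-configurations already satisfy the coarse description “roughly half up”.

```python
import math

N = 100
total = 2 ** N  # every micro-configuration of N two-state elements

# Microstates whose 'up' count lies in a narrow window around N/2,
# i.e. the ones that all satisfy the same coarse, "featureless" description.
near_half = sum(math.comb(N, k) for k in range(40, 61))

print(near_half / total)   # roughly 0.96
```

Roughly 96% of everything the system could be doing answers to the same bland description. The blandness is compatibility, not emptiness.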
4. Why entropy tends to increase (without wanting to)
We can now restate the second law with far greater precision.
Entropy does not increase because systems strive for disorder, nor because time pushes them forward.
Entropy increases because, across successive construals, configurations with higher relational availability:
can be re-actualised more easily
can absorb perturbations without collapse
can be continued in many more ways
When construal continues, these configurations overwhelmingly dominate.
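A toy simulation makes the point without any appeal to striving. The sketch below is an illustrative Ehrenfest-style model, not part of the relational account itself: start from a highly specific configuration and let each step flip one randomly chosen element.

```python
import random

def blind_dynamics(n_elements=100, steps=2000, seed=0):
    """Flip one randomly chosen two-state element per step and record how many are 'up'.

    The rule is blind and unbiased, yet the macro-count drifts toward ~n/2
    and stays there, simply because vastly more configurations are compatible
    with that description than with the starting one.
    """
    random.seed(seed)
    state = [0] * n_elements          # a highly specific starting configuration: all 'down'
    history = []
    for _ in range(steps):
        i = random.randrange(n_elements)
        state[i] ^= 1                 # flip one element
        history.append(sum(state))    # the coarse, macro-level count
    return history

traj = blind_dynamics()
print(traj[0], traj[-1])                   # begins near 0, typically ends near 50
print(min(traj[1500:]), max(traj[1500:]))  # late values hover around 50
```

No tendency is built into the rule. The drift is simply what blind re-cutting looks like when one region of configurations dwarfs all the others.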
5. The asymmetry hidden by the disorder metaphor
The metaphor of disorder hides the most important asymmetry in thermodynamics: the asymmetry between configurations with many nearby continuations and configurations with almost none.
Once a system occupies a region of high relational availability, returning to a highly specific configuration requires coordinating an enormous number of constraints simultaneously.
Entropy, properly understood, names this asymmetry.
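The asymmetry can be put in toy numbers, again using the invented 100-element system. Reaching the broad region requires coordinating nothing; returning to one specific configuration requires getting every constraint right at once.

```python
N = 100  # independent two-state constraints to coordinate

# A blind re-cut satisfies each individual constraint half the time,
# so hitting one specific configuration means satisfying all N at once:
p_specific = 0.5 ** N
print(p_specific)   # ~7.9e-31

# The broad high-availability region, by contrast, needs no coordination:
# as computed above, roughly 96% of configurations already satisfy it.
```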
6. A relational restatement
We can now restate the core claim cleanly:
Entropy is not disorder.
Entropy is not decay.
Entropy is not loss.
Entropy is a measure of how many ways a configuration can keep being re-actualised.
Seen this way, the second law does not describe the world running down.
It describes the world overwhelmingly occupying regions where continuation is cheap.
7. Looking ahead
If entropy is relational availability, then the familiar arrow of time cannot be a temporal arrow at all.
It must instead be an asymmetry in the space of constraints.
In the next post, we will show precisely how the arrow of time emerges as a gradient of re-actualisation cost, without invoking flow, direction, or temporal metaphysics.
For now, we can let the most misleading metaphor of thermodynamics quietly dissolve.
What remains is not disorder, but abundance.