Monday, 26 January 2026

Thermodynamics Without Time 2: Entropy Is Relational Availability, Not Disorder

Few concepts in physics have been so persistently misunderstood — and so stubbornly defended in their misunderstanding — as entropy.

Outside technical contexts, entropy is almost universally glossed as disorder.

Rooms become messy.
Structures fall apart.
Things run down.

This imagery is vivid, intuitive, and almost entirely misleading.

In this post, we remove the metaphor of disorder altogether and replace it with a relational account that does actual explanatory work.


1. Why “disorder” feels right — and why it fails

The metaphor of disorder persists because it appears to match experience.

Broken objects look less organised than intact ones.
Mixed gases look messier than separated ones.
A shuffled deck looks less structured than a sorted one.

But appearances conceal the real issue.

What changes in these cases is not the amount of structure, but the number of structurally compatible continuations.

A shattered glass is not less structured than an intact one. It is structured differently — and in vastly more ways.

Disorder is not a physical property.
It is a judgement made from a particular construal.

Entropy is not that judgement.


2. Entropy as a count of availability

Stripped of metaphor, entropy is a statistical measure.

More precisely, it is a measure of how many micro-configurations are compatible with a given macro-description.
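
To make the counting concrete, here is a minimal sketch in Python. It uses a toy system of N coins, where the macro-description is simply the number of heads; the system and parameters are illustrative, not anything specific to this series:

    from math import comb, log

    # Toy system: N two-state components ("coins").
    # Macro-description: exactly k of them show heads.
    # Micro-configurations compatible with that description: C(N, k).
    N = 100
    for k in (0, 10, 50):
        omega = comb(N, k)    # number of compatible micro-configurations
        s = log(omega)        # Boltzmann's S = ln(W), in units of k_B
        print(f"k={k:3d}  microstates={omega:.3e}  S/k_B={s:.1f}")

Roughly 10^29 microstates satisfy "half heads"; exactly one satisfies "no heads". Entropy tracks that count, not any visual impression of mess.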

From the perspective of the relational ontology, this can be stated more directly:

Entropy measures the density of relationally compatible re-actualisations of a construal.

High-entropy configurations are those that can be re-cut in many ways without violating the constraints that define them.

Low-entropy configurations are those that can be re-cut only in very specific ways.

Nothing here implies mess, decay, or failure.

It implies availability.


3. Why high entropy looks “boring”

Another common intuition is that high-entropy states are dull or featureless.

A uniform gas seems less interesting than a carefully prepared gradient.
A system at thermal equilibrium looks uneventful.

But this boredom is a projection of our interests, not a feature of the configuration.

High-entropy states are not simple.
They are redundant.

They can be realised in innumerable ways that all satisfy the same coarse description.

Their apparent sameness is a triumph of relational compatibility, not a loss of structure.
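
The same illustrative coin system makes the redundancy quantitative. Nearly all microstates fall within a narrow band around the uniform macro-description:

    from math import comb

    # Fraction of all 2**N microstates whose head-count lies within
    # 5% of the even split. The "sameness" is sheer multiplicity.
    N = 1000
    total = 2 ** N
    band = sum(comb(N, k) for k in range(450, 551))
    print(f"fraction within 5% of uniform: {band / total:.4f}")  # about 0.999

The uniform-looking states are not one state. They are almost all of them.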


4. Why entropy tends to increase (without wanting to)

We can now restate the second law with far greater precision.

Entropy does not increase because systems strive for disorder, nor because time pushes them forward.

Entropy increases because, across successive construals, configurations with higher relational availability:

  • can be re-actualised more easily

  • can absorb perturbations without collapse

  • can be continued in many more ways

When construal continues, these configurations overwhelmingly dominate.

This is not a tendency imposed on the world.
It is a statistical fact about compatibility spaces.
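
A small simulation shows this dominance without building in any preference. The sketch below uses an Ehrenfest-style urn model, a standard toy model adopted here purely as an illustration, not as part of the relational framework itself: N balls in two boxes, with one randomly chosen ball moved per step.

    import random

    # N labelled balls split between two boxes; each step moves one ball
    # chosen uniformly at random. No step aims at balance, yet the system
    # drifts to near-even splits, because vastly more micro-configurations
    # realise them than realise the initial "all in one box" macrostate.
    random.seed(0)
    N, steps = 1000, 20000
    left = N                              # start fully on one side
    for _ in range(steps):
        if random.randrange(N) < left:    # the chosen ball was in the left box
            left -= 1
        else:
            left += 1
    print(f"left box holds {left} of {N}")  # settles near N/2

Run it from any starting split: it ends near N/2, not because balance is sought, but because balance is where the compatible continuations pile up.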


5. The asymmetry hidden by the disorder metaphor

The metaphor of disorder hides the most important asymmetry in thermodynamics.

The asymmetry is not between order and chaos.
It is between:

  • configurations with many nearby continuations

  • configurations with almost none

Once a system occupies a region of high relational availability, returning to a highly specific configuration requires coordinating an enormous number of constraints simultaneously.

Nothing forbids this.
Almost nothing supports it.
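
A rough count, again in the illustrative coin system, shows just how little supports it. Compare the lone microstate realising one fully specific configuration against the microstates sitting near the uniform description:

    from math import comb

    # One microstate realises "all heads". How many realise
    # "head-count within 5% of even"? The ratio is the asymmetry.
    N = 1000
    near_uniform = sum(comb(N, k) for k in range(450, 551))
    print(f"ratio: {near_uniform:.3e} to 1")  # about 1e301 to 1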

Entropy, properly understood, names this asymmetry.


6. A relational restatement

We can now restate the core claim cleanly:

  • Entropy is not disorder.

  • Entropy is not decay.

  • Entropy is not loss.

Entropy is a measure of how many ways a configuration can keep being re-actualised.

Seen this way, the second law does not describe the world running down.

It describes the world overwhelmingly occupying regions where continuation is cheap.


7. Looking ahead

If entropy is relational availability, then the familiar arrow of time cannot be a temporal arrow at all.

It must instead be an asymmetry in the space of constraints.

In the next post, we will show precisely how the arrow of time emerges as a gradient of re-actualisation cost, without invoking flow, direction, or temporal metaphysics.

For now, we can let the most misleading metaphor of thermodynamics quietly dissolve.

What remains is not disorder, but abundance.
