“The more articulated a system becomes, the more it exceeds any articulation of it.”
In the recent series on nonsense, we made repeated use of the term surplus.
A thoughtful question arose:
Is this simply a reworking of ideas more commonly framed in terms of entropy?
The intuition is understandable. Both concepts appear to concern multiplicity, excess, or a “more-than” that exceeds any single configuration. Both presuppose a system and a space of possible states. Both seem to gesture toward an abundance beyond control.
But the resemblance is only structural. Ontologically, they diverge.
Clarifying this divergence sharpens what surplus names — and what nonsense does.
1. What Entropy Names
In thermodynamics, entropy quantifies the dispersal of energy across the microstates available to a system. In information theory (as formalised by Claude Shannon), entropy measures the expected uncertainty of a signal.
In both cases:
- Entropy is quantitative.
- It concerns probability distributions.
- It is indifferent to meaning.
- It increases as distinctions become statistically flattened.
Entropy describes the dispersal of states within defined constraints.
It is a measure of distribution.
It does not describe articulation.
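The quantitative, meaning-blind character of entropy can be made concrete with a short sketch of Shannon's formula. This is only an illustration; the function name and the example distributions are invented here, not drawn from the series.

```python
import math

def shannon_entropy(probs):
    """Expected uncertainty of a distribution, in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply peaked distribution: one outcome dominates.
peaked = [0.7, 0.1, 0.1, 0.1]

# A flat distribution: all four outcomes equally likely.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # lower entropy: distinctions remain
print(shannon_entropy(uniform))  # maximal entropy for four states: 2.0 bits
```

Entropy peaks when the distribution is flat: as statistical distinctions between outcomes disappear, the number rises. Note what the calculation never touches — what any of the four outcomes means.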
2. What Surplus Names
Surplus, as used in the nonsense series, does not refer to disorder, randomness, or statistical dispersion.
It names something far more precise:
The excess of possible construal over any particular actualisation.
Surplus is not noise.
It is not degradation.
It is not the breakdown of structure.
It is the persistence of unrealised potential within constraint.
Where entropy measures how evenly states are distributed, surplus concerns how inexhaustible articulation remains.
Entropy concerns probability.
Surplus concerns meaning.
3. The Structural Analogy — and Its Limit
There is a formal resemblance between the two concepts.
Both presuppose:
- A system.
- A space of possible configurations.
- A distinction between the actual and the possible.
This structural similarity explains the intuitive link.
But this is where the similarity ends.
Entropy increases as differences flatten statistically.
Surplus increases as relational articulation deepens.
Entropy trends toward equilibrium.
Surplus sustains generative asymmetry.
Entropy erodes usable gradients.
Surplus makes gradients productive.
Entropy describes exhaustion within a closed system.
Surplus describes inexhaustibility within an open semiotic ecology.
The difference is not semantic. It is ontological.
4. Constraint: Limitation or Condition?
The decisive divergence lies in how constraint functions.
In thermodynamic models, constraint limits possible transitions. Entropy tracks what happens within those limits.
In semiotic systems, constraint does not merely limit. It enables articulation. Without patterned constraint, there is no surplus — only noise.
Remove all constraint and you do not maximise surplus.
You eliminate it.
Surplus requires structured potential. It emerges from patterned possibility. It depends on asymmetry.
This is why nonsense is powerful.
Nonsense does not abolish constraint. It modulates it. It loosens semantic expectation without collapsing systemic coherence. It increases surplus without dissolving pattern.
If surplus were entropy, nonsense would be degradation.
But nonsense is not decay.
It is dilation.
5. Why the Distinction Matters
Entropy has become a dominant cultural metaphor for disorder, collapse, and loss of control. It tempts us to treat any excess of possibility as drift toward chaos.
But the nonsense series has been arguing something else.
Possibility is not the opposite of order.
Generativity is not decay.
Inexhaustibility is not disorder.
Surplus is not the flattening of distinctions.
It is the structured excess that makes articulation possible.
This distinction reframes everything:
- Nonsense is not the erosion of meaning, but its modulation.
- Myth is not frozen order, but stabilised surplus.
- Luminous experience is not chaos, but saturation.
Entropy describes systems tending toward equilibrium.
Surplus describes ecologies tending toward articulation.
6. Conclusion
There is a structural analogy between entropy and surplus: both presuppose systems and spaces of possibility.
But they belong to categorically distinct domains.
Entropy measures the statistical distribution of states.
Surplus names the inexhaustibility of construal within patterned constraint.
To confuse them is to mistake probability for meaning.
And nonsense reminds us of something more demanding:
“The more articulated a system becomes, the more it exceeds any articulation of it.”