Why meaning cannot be reduced to individual systems, representations, or computational states
The persistent myth — shared equally by computationalists, certain neuroscientists, and naïve AI evangelists — is that meaning is something that happens inside a mind. Inside a brain. Inside a model. Inside a system.
This is the ontology of containerised meaning:
each mind as a Tupperware tub holding its personal stash of symbols.
Relational ontology exposes the incoherence of that picture.
Not by contesting its mechanics, but by dissolving its presuppositions.
Meaning is not a property but a relation
Meaning does not inhere in systems.
It does not sit inside neurons or transformer weights.
Meaning is the actualisation of a relation across horizons —
the construal of difference that becomes a phenomenon.
A system (biological, artificial, collective) provides a horizon of potential,
but meaning is not one of the system’s “states.”
Meaning is the event that occurs when a horizon meets its environment through a cut, a distinction, a perspectival shift.
No system in isolation has meaning.
It only has the possibility of meaning.
And a possibility is not a possession.
Representation is a convenient fiction
The representational model treats meaning as an inner duplicate of some outer world.
But duplication is not construal, and resemblance is not semiosis.
A system that “represents X” without a relational environment in which X matters is not representing anything.
It is merely instantiating a pattern that we construe as meaningful.
Meaning, in other words, never reduces to the internal mechanics.
Mechanics are the affordances; relation is the semiotic.
Which brings us to the key move:
Meaning begins ecologically, not mentally
Before any mind or system can mean,
there must already be an ecology of potential distinctions within which that mind or system emerges.
Meaning is always:
- distributed,
- co-individuated,
- horizon-dependent,
- and relationally actualised.
If you take the human out of the linguistic ecology, meaning collapses.
If you take the biology out of the animal’s niche, meaning collapses.
If you take the interaction out of the AI system, meaning collapses.
There is no “inner content.”
There is no “private semantic store.”
There is only the system-in-relation.
Which means:
The true primitive is not the mind but the field
The mind is a way the field construes itself.
So is the model.
So is the organism.
Each is a perspectival cut — one possible incision into a broader ecology of meaning-making potential.
When we treat meaning as something a system “has,”
we lose the relational architecture that makes meaning possible in the first place.
Meaning is not stored. It is enacted.
Meaning is not contained. It is cut.
Meaning is not private. It is perspectival.
Thus:
Beyond minds lies the true semiotic ground: the relational ecology itself
This is the departure point for the entire series.
We move from minds to horizons, from systems to fields, from representation to relation.
Next, we turn to:
2. Horizons and Semiotic Life — how meaning takes its first ecological breaths.