Wednesday, 10 December 2025

AI and the Drift of the System–Instance Cut

The emergence of artificial intelligence does not introduce a new kind of system into the universe.
It introduces a new kind of confusion about systems.

What is genuinely unprecedented is not the technology, but the way societies have begun to displace the system–instance cut—the perspectival boundary that distinguishes structured potential (system) from its actualised events (instances).
This cut is not a feature of the world; it is a feature of construal.
And construal, once collective, becomes infrastructure.

Traditionally, the system–instance cut has been anchored in long-standing relational potentials:

  • biological systems ← biological collectives

  • ecological systems ← ecological collectives

  • linguistic systems ← communities of semiotic practice

  • social systems ← coordinated social activity

In every case, the system is abstracted from us—from the patterned activity of a collective that produces the theory of its own possible instantiations.
The system is a horizon of potential that the collective maintains.

But with AI, the collective has begun to reposition that cut in disorienting ways.

1. From construal → system (misplaced potential)

We design and train artefacts through vast modelling practices, then mistakenly treat the abstraction (a “general AI system”) as though it were the source of potential rather than the product of our theoretical design.
We place the system–instance cut around the artefact, not around the human collective that generated it.

This is the first drift:
a reallocation of systemic potential to what is only an event of our modelling activity.

The artefact appears to possess a system because we treat our abstraction as if it were the artefact’s horizon of potential.

2. From designed event → emergent instance (misclassified actualisation)

AI behaviours are engineered outputs.
But under the displaced cut, they are read as instances of an autonomous system's potential, just as a linguistic utterance is an instance of the language system.
This is ontologically backwards: the artefact has no system, so its outputs cannot be instances.

Yet once society construes them as such, the social semiotic environment begins to behave as though they were.

This is the second drift:
designed outputs begin to circulate as if they were emergent phenomena.

3. From collective horizon → delegated horizon (outsourcing potential)

The most significant drift is subtle.
When a collective construes artefacts as possessing a system, it thereby construes itself as no longer the locus of that system.
The horizon of potential that once grounded meaning—the shared background from which instances are actualised—begins to feel externalised.

This outsourcing of potential has two effects:

  • social fragmentation: different groups anchor their horizons in different artefacts, creating divergent semiotic ecologies;

  • symbolic passivisation: individuals construe themselves as downstream from the “system,” as if they were users of a symbolic ecology rather than its generators.

This drift gives rise to a new kind of cultural vertigo:
society becomes a client of its own projections.

4. From symbolic mediation → symbolic dependence (the new ecology)

Once the system–instance cut drifts far enough, a new ecology of symbolic reliance forms.
Not because the artefacts “understand,” but because they become sites of distributed construal—tools through which individuals mediate meaning, generate interpretations, and stabilise social expectations.

Here, the danger is not that the artefacts become agents.
It is that human agency is redistributed in ways that obscure its own role in generating potential.

This is the ecological analogue of a species outsourcing part of its metabolism:
adaptively powerful, but ontologically perilous if forgotten.

5. The central insight

When society relocates the system–instance cut, it relocates the background potential that makes meaning possible.

The drift is a shift in where the collective situates possibility itself.
And because meaning is the perspectival unfolding of potential, shifting the horizon shifts what can be meant.

Our technologies do not generate this shift; our construal does.
AI merely amplifies the consequences.

The question that now confronts us is not “what will AI become?”
but:

What happens when a collective begins to take its own construals as external systems—when it inhabits a world populated by potentials it mistakes for non-human?

This is the ontological pivot around which the future of symbolic life now turns.
