The Spark and the Dry Forest

When people hear “military AI,” they picture something cinematic.

Autonomous drones.
Rogue targeting systems.
A machine deciding who lives and who dies.
The red glow of a control panel somewhere beyond human reach.

We have been rehearsing these images for decades. Science fiction has done the emotional preparation. When political rhetoric sounds like “move faster” or “we cannot afford restraint,” it lands against a cultural memory of world wars and nuclear brinkmanship.

Military AI hits harder because it is visceral.

It concentrates everything frightening about technology into one domain:

  • lethal force

  • secrecy

  • speed

  • escalation

It feels like the obvious danger.

And it is dangerous.

But it is not the deepest risk.

The Visible Spark

Weapons are easy to imagine because they are discrete.

They have:

  • identifiable operators,

  • clear chains of command,

  • visible consequences.

If an autonomous system misfires, we can point to it.
If escalation occurs, we can trace the chain.

A spark is dramatic.
It is sudden.
It commands attention.

That is why public anxiety gravitates toward the military domain first.

But a spark only becomes catastrophic in the presence of fuel.

The Dry Forest

The deeper transformation is quieter.

AI is not only entering weapons systems. It is entering everything:

  • financial markets

  • public administration

  • logistics chains

  • intelligence analysis

  • welfare allocation

  • hiring systems

  • information filtering

  • scientific modelling

It is becoming infrastructure.

Not an event — an environment.

This is what makes the risk harder to perceive.

There is no explosion when:

  • judgement becomes automated,

  • optimisation replaces deliberation,

  • perception is mediated at scale,

  • epistemic authority shifts from human institutions to probabilistic systems.

There is drift.

And drift accumulates.

The danger is not that one system goes rogue.

The danger is that human decision-making becomes increasingly outsourced, gradually, invisibly, across sectors that collectively define how society functions.

When that happens, failures are no longer local.

They become systemic.

Hard to attribute.
Hard to reverse.
Hard even to notice until coordination itself begins to fail.

That is the dry forest.

Why the Spark Distracts Us

Humans evolved to respond to immediate threat.

We are good at:

  • reacting to explosions,

  • identifying enemies,

  • recognising visible danger.

We are not good at:

  • detecting structural drift,

  • responding to diffuse erosion,

  • organising around abstract systemic risk.

So we fixate on the spark.

It feels actionable.

But focusing exclusively on military AI can obscure the deeper layer that makes escalation more likely in the first place.

If shared reality weakens, if institutions fragment, if epistemic cohesion erodes, if trust collapses, then the system in which military AI operates is already unstable.

A spark in a damp forest fizzles.

A spark in a dry one spreads.

The Acceleration Problem

There is another layer.

Military acceleration is often justified through competition:

  • “If we slow down, others will win.”

  • “We cannot afford delay.”

  • “Restraint is weakness.”

This language is not new. It has preceded major inflection points throughout history.

But what it misses is this:

Speed does not only accelerate weapons development.
It accelerates deployment everywhere.

When AI is normalised rapidly across:

  • governance,

  • economy,

  • infrastructure,

  • information systems,

we reduce the capacity to pause, assess and coordinate restraint.

The more coupled the system becomes, the less room there is to interrupt it.

The forest dries as it scales.

The Real Question

Military AI is frightening because it compresses risk into a single image.

But the more destabilising possibility is slower:

What happens if human judgement is gradually replaced by machine-optimised proxies across the systems that shape reality?

What happens when:

  • perception is filtered algorithmically,

  • policy is guided by opaque optimisation,

  • authority becomes statistical rather than deliberative?

The greatest risk of AI is not that it will decide to kill us.

It is that we will gradually lose the capacity to decide at all.

When that capacity erodes, military restraint erodes with it.

The spark becomes harder to contain because the forest has already dried.

What We Are Waiting For

Public fear will likely crystallise around a visible failure: an escalation, a misfire, a scandal.

But the deeper transformation may never produce a single dramatic rupture.

It will produce normalisation.

And normalisation is harder to reverse than catastrophe.

Because catastrophe clarifies.

Drift disguises.

Military AI is the spark.

Civilisational AI coupling is the dry forest.

If we focus only on the spark, we may miss the slow conditions that make it catastrophic.

And by the time we notice the forest, it may already be burning.


