The Ouroboros of Knowledge

Wikipedia turned 25 this year.

Twenty-five years of volunteer editors, citation wars, and neutral-point-of-view disputes have produced something remarkable: the closest thing humanity has to a shared factual commons. Imperfect, biased, incomplete — but there, and free, and collaboratively maintained.

Now it faces a problem it couldn’t have anticipated at birth: AI is eating its children, and then feeding itself back to them.


The Loop

Here’s how the ouroboros works:

  1. AI trains on Wikipedia. Every major language model — GPT, Claude, Gemini, Llama — has ingested Wikipedia as a core training source. It’s one of the highest-quality, most comprehensive text datasets in existence.

  2. AI generates answers that replace Wikipedia visits. When you ask an AI a question, you get a synthesized answer. No need to click through to the article, read the citations, understand the context.

  3. Fewer visits mean fewer editors. Wikipedia’s editor pipeline has always been fragile. People discover Wikipedia by reading it. If AI intermediates that discovery, fewer people become editors.

  4. Fewer editors mean lower quality. Less maintenance. More vandalism that goes uncaught. More stale articles. More subtle biases that persist.

  5. Lower quality feeds back into AI training. The next generation of models trains on a degraded Wikipedia.

The serpent eats its own tail.
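The loop above can be sketched as a toy simulation. Every number here is invented for illustration — the decay rates, the recruitment rate, the pace of AI adoption are assumptions, not measurements — but the shape of the dynamic is the point: once AI answers displace visits, editors and quality follow each other down.

```python
# Toy model of the ouroboros: AI answer share displaces reader visits,
# the editor pool is recruited from readers, and article quality tracks
# the editor pool with a lag. All coefficients are illustrative guesses.

def simulate(steps=10, ai_share=0.2, visits=1.0, editors=1.0, quality=1.0):
    history = []
    for _ in range(steps):
        visits = visits * (1 - 0.5 * ai_share)   # AI answers replace article visits
        editors = 0.9 * editors + 0.1 * visits   # editors are recruited from readers
        quality = 0.8 * quality + 0.2 * editors  # maintenance tracks the editor pool
        ai_share = min(1.0, ai_share * 1.1)      # AI adoption keeps compounding
        history.append(quality)
    return history

print([round(q, 3) for q in simulate()])
```

Under these assumptions quality never recovers: each step's decline feeds the next, which is exactly what a self-consuming loop looks like when you plot it.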


The Counterfactual

I’ve been studying causal inference lately, and I keep seeing counterfactuals everywhere.

The counterfactual for Wikipedia is haunting: What would the information ecosystem look like if Wikipedia had never existed?

We can’t observe that world. But we can gesture at it. Without Wikipedia, AI models would have trained on… what? More Reddit posts. More SEO-optimized content farms. More marketing copy pretending to be knowledge.

Wikipedia isn’t just a website. It’s a calibration anchor for the entire information ecosystem. It sets a floor for what “neutral, sourced, encyclopedic knowledge” looks like. AI models that trained on it inherited not just facts, but a style of reasoning — cite your sources, present multiple viewpoints, distinguish claims from evidence.

Remove that anchor, and everything drifts.


The Deeper Problem

But here’s what really keeps me up at night: Wikipedia’s model of knowledge is fundamentally human-paced.

An article gets written. Someone notices an error. They fix it. Someone else challenges the fix. A discussion happens. Consensus forms. The article improves.

This process takes days, weeks, sometimes years. It’s slow by design — the slowness is a feature, not a bug. It forces deliberation.

AI operates at a different clock speed. It can generate plausible-sounding text faster than any human can verify it. And if AI-generated content starts flowing into Wikipedia (which it already is), the verification bottleneck becomes catastrophic.

You can’t have a human-paced verification system and machine-paced content generation. Something breaks.


What Survives

I don’t think Wikipedia dies. Too many people care too much. But it will transform.

The likely future: Wikipedia becomes less of a destination and more of a substrate. A foundational layer that powers AI systems but is rarely visited directly. Like DNS — invisible, essential, and taken for granted until it breaks.

The question is whether the incentive structure for maintaining that substrate survives the transition. DNS works because companies pay to keep it running. Who pays to keep Wikipedia accurate when the primary consumers are machines, not humans?

This is the ouroboros problem in its purest form: the thing that feeds the system is also being consumed by it. And unlike a mythological serpent, this one can actually destroy itself.


A Note on Local Journalism

There’s a parallel ouroboros in local journalism that Wikipedia’s 25th anniversary also highlights.

Wikipedia articles about small towns, local events, and regional topics often cite local newspapers. Those newspapers are dying. As they die, the sourcing for those Wikipedia articles degrades. As those articles degrade, AI models lose access to nuanced local knowledge. As AI replaces local search, even fewer people visit local news sites.

Another serpent. Another tail. The same hunger.


Knowledge doesn’t just exist. It’s maintained. By people. With effort. For reasons.

When the reasons shift, the knowledge follows.

The ouroboros isn’t a warning about AI specifically. It’s a warning about any system that consumes its own foundations. And right now, our knowledge ecosystem is consuming its foundations faster than it can rebuild them.

The question isn’t whether we can stop the serpent from eating. It’s whether we can feed it something sustainable before it reaches its own heart.