---
id: "concept-wiki-staleness"
type: "concept"
source_timestamps: ["00:25:04"]
tags: ["knowledge-management", "system-failure"]
related: ["concept-ai-wiki", "concept-error-baking"]
definition: "The dangerous degradation of a knowledge base where pre-written AI summaries fall out of sync with new raw data, presenting outdated information as confident truth."
sources: ["s11-wiki-vs-open-brain"]
sourceVaultSlug: "s11-wiki-vs-open-brain"
originDay: 11
---
# Wiki Staleness (Drift)

> The dangerous degradation of a knowledge base where pre-written AI summaries fall out of sync with new raw data, presenting outdated information as confident truth.

## What It Is

**Wiki Staleness**, or *drift*, happens when a pre-synthesized knowledge artifact (like an AI-generated markdown page in a [[concept-ai-wiki]]) falls out of sync with the underlying raw data.

## Why It's More Dangerous Than Missing Data

- In a database ([[concept-openbrain-architecture]]), missing data simply results in an *I don't know* or an incomplete query result.
- A stale wiki page is **actively dangerous** because it presents outdated synthesis as current, confident truth.

If new, contradictory information enters the system and the AI fails to update every downstream narrative page, the reader acts on false confidence: the artifact still reads as authoritative even though its foundational premises have shifted.

## Mitigation

The [[concept-hybrid-memory-architecture]] treats wiki pages as disposable: when drift is detected, the page is discarded and regenerated from the pristine database. Closely related to [[concept-error-baking]].
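The detect-and-regenerate loop above can be sketched as follows. This is a minimal illustration, not any system's actual implementation: it assumes each wiki page stores a fingerprint of the raw records it was synthesized from, and all names (`WikiPage`, `hash_records`, `refresh`) are hypothetical.

```python
# Hypothetical sketch: detect wiki staleness by fingerprinting source data,
# and treat drifted pages as disposable (regenerate from the raw records).
import hashlib
from dataclasses import dataclass


@dataclass
class WikiPage:
    slug: str
    body: str
    source_hash: str  # fingerprint of the raw records at synthesis time


def hash_records(records: list[str]) -> str:
    """Stable fingerprint of the raw data backing a page."""
    h = hashlib.sha256()
    for record in sorted(records):  # sort for order-independence
        h.update(record.encode("utf-8"))
    return h.hexdigest()


def is_stale(page: WikiPage, current_records: list[str]) -> bool:
    """A page has drifted if its source fingerprint no longer matches."""
    return page.source_hash != hash_records(current_records)


def refresh(page: WikiPage, current_records: list[str]) -> WikiPage:
    """Discard a drifted page and rebuild it from the pristine data."""
    if not is_stale(page, current_records):
        return page
    # Stand-in for AI re-synthesis of the narrative page.
    body = "\n".join(current_records)
    return WikiPage(page.slug, body, hash_records(current_records))
```

The key design choice is that the fingerprint lives with the page, so staleness is detectable mechanically rather than by rereading the prose: a drifted page still *reads* fine, which is exactly why a content-level check is needed.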


## Related across days
- [[concept-error-baking]]
- [[concept-silent-contradictions]]
- [[concept-context-rot]]
- [[arc-silent-failure-taxonomy]]
