# Vision
## The Problem

Most people’s experience with LLMs and documents is stateless. RAG systems re-derive knowledge from scratch on every query. Ask a subtle question that requires synthesizing five documents, and the LLM has to find and piece together the relevant fragments every time. Nothing compounds.
## The Solution

Instead of retrieving from raw documents at query time, the LLM incrementally builds and maintains a persistent wiki — a structured, interlinked collection of markdown files. When you add a new source, the LLM reads it, extracts key information, and integrates it into the existing wiki — updating entity pages, revising summaries, noting contradictions.
The wiki is a persistent, compounding artifact. Cross-references are already there. Contradictions have been flagged. Synthesis reflects everything you’ve read. It gets richer with every source and every question.
You never write the wiki yourself — the LLM writes and maintains all of it. You curate sources, explore, and ask the right questions.
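The ingest-and-integrate loop above can be sketched in a few lines. This is a minimal illustration, not the project's actual implementation: `integrate_source`, the `llm` callable, and its `{filename: markdown}` return shape are all hypothetical names chosen for the example.

```python
from pathlib import Path

def integrate_source(source_path: str, vault_dir: str, llm) -> list[str]:
    """Sketch: feed one new source plus the current wiki index to an
    LLM and apply the page updates it returns. All names hypothetical."""
    vault = Path(vault_dir)
    source_text = Path(source_path).read_text()

    # The wiki is just markdown files; list them so the LLM can
    # decide which pages to create, revise, or cross-link.
    index = "\n".join(p.name for p in vault.glob("*.md"))

    # `llm` is assumed to return {filename: new_markdown} updates,
    # including revised summaries and flagged contradictions.
    updates = llm(source=source_text, wiki_index=index)

    touched = []
    for name, markdown in updates.items():
        (vault / name).write_text(markdown)
        touched.append(name)
    return touched
```

The key property is that each call mutates the same vault, so every new source compounds on top of everything integrated before it.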
## Why This Works

The tedious part of maintaining a knowledge base is the bookkeeping — cross-references, summaries, consistency. Humans abandon wikis because maintenance cost grows faster than value. LLMs don’t get bored and can touch 15 files in one pass, so the maintenance cost drops to near zero.
## Use Cases

- Personal — goals, health, journal, self-improvement
- Research — deep-dive on a topic over weeks/months
- Reading a book — characters, themes, plot threads
- Business/team — internal wiki fed by Slack threads, meeting transcripts
- Cross-project knowledge — tech patterns, strategy playbooks that carry forward
## What Makes LLM Wiki Different

- Multi-vault with cross-pollination — multiple vaults with a promote/reference flow for knowledge transfer
- One engine, many vaults — Claude Code skills are the centralized engine, vaults are just data
- Open source — clone it, use it, contribute to it
- Lazy dependencies — tools install themselves on first use
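The lazy-dependencies idea can be illustrated as a check-then-install wrapper: look up a tool on first use, and run an installer only if it is missing. This is a sketch of the pattern, not the project's actual mechanism; `ensure_tool` and its `install_cmd` parameter are illustrative.

```python
import shutil
import subprocess

def ensure_tool(name: str, install_cmd: list[str]) -> str:
    """Sketch of lazy dependencies: resolve a CLI tool on first use,
    installing it only if it is not already present."""
    path = shutil.which(name)
    if path is None:
        # First use and the tool is missing: run the installer now.
        subprocess.run(install_cmd, check=True)
        path = shutil.which(name)
        if path is None:
            raise RuntimeError(f"installing {name} failed")
    return path
```

Because the install happens inside the first call, a fresh clone carries no setup step: vaults stay pure data, and tooling materializes only when a skill actually needs it.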
## Inspiration

- Karpathy’s LLM Wiki gist — the original idea file
- Karpathy’s viral tweet — 55K likes
- Vannevar Bush’s Memex (1945) — private, curated knowledge store with associative trails