# Memory Distillation
Memory distillation improves the quality of your agent's knowledge over time by compacting redundant memories, extracting structured facts, and detecting embedded secrets.
Feature flag: `HIPPOCORTEX_COMPILER_DISTILL` (enabled by default)
## Pipeline
- Cluster detection — groups similar memories via semantic similarity
- Compaction — merges near-duplicates into consolidated entries
- Fact extraction — LLM-powered extraction of key facts and summaries
- Pattern mining — identifies recurring task schemas, failure playbooks, decision policies, causal patterns
- Secret scanning — an LLM pass catches secrets that the inline regex scan missed
- NER piggybacking — extracts entities for the knowledge graph
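The cluster-detection step above can be sketched as greedy grouping of memory embeddings by cosine similarity. This is an illustrative standalone example, not HippoCortex internals: the function names and the 0.85 threshold are assumptions.

```python
# Hypothetical sketch of cluster detection: group memory embeddings
# whose cosine similarity to a cluster's first member exceeds a threshold.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def cluster_memories(embeddings, threshold=0.85):
    """Assign each embedding to the first cluster whose representative
    (first member) is within the similarity threshold; else start a new one."""
    clusters = []  # each cluster is a list of indices into `embeddings`
    for i, emb in enumerate(embeddings):
        for cluster in clusters:
            if cosine(embeddings[cluster[0]], emb) >= threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters

vecs = [
    [1.0, 0.0], [0.98, 0.1],  # near-duplicate memories
    [0.0, 1.0],               # a distinct memory
]
print(cluster_memories(vecs))  # [[0, 1], [2]]
```

Clusters found here feed the compaction step, which merges each group's near-duplicates into one consolidated entry.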
## When It Runs
- Auto-compile: after every 10 captured events (5-minute sweep)
- Hourly distillation: compiler queue processes all tenants
- Up to 200 memories per run per tenant
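The scheduling rules above (the 10-event trigger and the 200-memory cap) can be sketched as a small per-tenant counter. The class and method names here are illustrative assumptions, not the actual scheduler API.

```python
# Hypothetical sketch of the auto-compile trigger and per-run cap.
class CompileScheduler:
    EVENT_THRESHOLD = 10  # auto-compile after every 10 captured events
    MAX_PER_RUN = 200     # up to 200 memories per run per tenant

    def __init__(self):
        self.pending = {}  # tenant_id -> uncompiled event count

    def record_event(self, tenant_id):
        """Count a captured event; return True when a compile is due."""
        self.pending[tenant_id] = self.pending.get(tenant_id, 0) + 1
        if self.pending[tenant_id] >= self.EVENT_THRESHOLD:
            self.pending[tenant_id] = 0
            return True
        return False

    def batch(self, memory_ids):
        """Cap a single distillation run at MAX_PER_RUN memories."""
        return memory_ids[: self.MAX_PER_RUN]

s = CompileScheduler()
due = [s.record_event("tenant-a") for _ in range(10)]
print(due[-1])  # True on the 10th event
```

The hourly distillation queue would simply call `batch` for every tenant, independent of the event counter.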
## Configuration
| Flag | Default | Description |
|---|---|---|
| `HIPPOCORTEX_COMPILER_DISTILL` | `true` | LLM distillation |
| `HIPPOCORTEX_PATTERN_MINING` | `false` | Pattern mining |
| `HIPPOCORTEX_COMPILER_EXTRACTORS` | `true` | Structured extraction |
| `HIPPOCORTEX_COMPILER_SECRET_SCAN` | `true` | LLM secret detection |
| `HIPPOCORTEX_SEMANTIC_DEDUP` | `true` | Semantic deduplication |
| `HIPPOCORTEX_LLM_NER` | `true` | LLM entity extraction |
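A minimal sketch of reading these flags from the environment, assuming they are plain environment variables with truthy string values; the `flag` helper and the accepted spellings are illustrative, and the defaults mirror the table above.

```python
# Hypothetical flag reader; defaults match the configuration table.
import os

DEFAULTS = {
    "HIPPOCORTEX_COMPILER_DISTILL": True,
    "HIPPOCORTEX_PATTERN_MINING": False,
    "HIPPOCORTEX_COMPILER_EXTRACTORS": True,
    "HIPPOCORTEX_COMPILER_SECRET_SCAN": True,
    "HIPPOCORTEX_SEMANTIC_DEDUP": True,
    "HIPPOCORTEX_LLM_NER": True,
}

def flag(name):
    """Return the flag's boolean value, falling back to its default."""
    raw = os.environ.get(name)
    if raw is None:
        return DEFAULTS[name]
    return raw.strip().lower() in ("1", "true", "yes", "on")

print(flag("HIPPOCORTEX_PATTERN_MINING"))  # False unless set in the env
```

Any string not recognized as truthy (e.g. `"0"`, `"false"`, `"off"`) disables the flag, which makes explicit opt-out unambiguous.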