Memory Distillation

Memory distillation improves the quality of your agent's knowledge over time by compacting redundant memories, extracting structured facts, and detecting embedded secrets.

Feature flag: HIPPOCORTEX_COMPILER_DISTILL (enabled by default)

Pipeline

  1. Cluster detection — groups similar memories via semantic similarity
  2. Compaction — merges near-duplicates into consolidated entries
  3. Fact extraction — LLM-powered extraction of key facts and summaries
  4. Pattern mining — identifies recurring task schemas, failure playbooks, decision policies, causal patterns
  5. Secret scanning — an LLM pass catches secrets that the inline regex scan missed
  6. NER piggybacking — extracts entities for the knowledge graph
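The first two stages (cluster detection and compaction) can be sketched without an LLM. This is a minimal illustration only: the class names, the greedy strategy, the 0.9 threshold, and the keep-longest-text merge rule are assumptions, not the HippoCortex implementation.

```python
from dataclasses import dataclass

@dataclass
class Memory:
    id: str
    text: str
    embedding: list[float]

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def cluster(memories: list[Memory], threshold: float = 0.9) -> list[list[Memory]]:
    # Stage 1 (sketch): greedily group memories whose embeddings are
    # similar to a cluster's first member.
    clusters: list[list[Memory]] = []
    for m in memories:
        for c in clusters:
            if cosine(m.embedding, c[0].embedding) >= threshold:
                c.append(m)
                break
        else:
            clusters.append([m])
    return clusters

def compact(clusters: list[list[Memory]]) -> list[Memory]:
    # Stage 2 (sketch): merge each cluster of near-duplicates into one
    # consolidated entry; here we simply keep the longest text.
    return [max(c, key=lambda m: len(m.text)) for c in clusters]
```

Stages 3-6 call out to an LLM and the knowledge graph, so they are not sketched here.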

When It Runs

  • Auto-compile: a 5-minute sweep compiles after every 10 captured events
  • Hourly distillation: compiler queue processes all tenants
  • Up to 200 memories per run per tenant
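The trigger conditions above can be expressed as simple guards. The constants mirror the documented limits; the function names and the batching model are illustrative assumptions.

```python
AUTO_COMPILE_EVERY_N_EVENTS = 10   # auto-compile threshold per sweep
SWEEP_INTERVAL_SECONDS = 300       # 5-minute sweep
MAX_MEMORIES_PER_RUN = 200         # per-tenant cap per distillation run

def should_auto_compile(pending_events: int) -> bool:
    # The sweep compiles a tenant once enough events have accumulated.
    return pending_events >= AUTO_COMPILE_EVERY_N_EVENTS

def next_batch(memory_ids: list[str]) -> list[str]:
    # The hourly queue processes each tenant, capped at 200 memories per run;
    # anything beyond the cap waits for the next run.
    return memory_ids[:MAX_MEMORIES_PER_RUN]
```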

Configuration

Flag                               Default  Description
HIPPOCORTEX_COMPILER_DISTILL       true     LLM distillation
HIPPOCORTEX_PATTERN_MINING         false    Pattern mining
HIPPOCORTEX_COMPILER_EXTRACTORS    true     Structured extraction
HIPPOCORTEX_COMPILER_SECRET_SCAN   true     LLM secret detection
HIPPOCORTEX_SEMANTIC_DEDUP         true     Semantic dedup
HIPPOCORTEX_LLM_NER                true     LLM entity extraction
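Flags like these are typically read from the environment. The helper below is a sketch; the exact set of accepted truthy strings is an assumption, not documented HippoCortex behavior.

```python
import os

def flag(name: str, default: bool) -> bool:
    # Read a boolean feature flag from the environment. Unset flags fall
    # back to the defaults in the table above; "1"/"true"/"yes"/"on"
    # (case-insensitive) count as enabled.
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in ("1", "true", "yes", "on")

# Defaults mirror the table above.
DISTILL_ENABLED = flag("HIPPOCORTEX_COMPILER_DISTILL", True)
PATTERN_MINING_ENABLED = flag("HIPPOCORTEX_PATTERN_MINING", False)
```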