# Consolidation Agent Prompts
Five Sonnet agents, each mapping to a biological memory consolidation process.
Run during "sleep" (dream sessions) or on-demand via `poc-memory consolidate-batch`.
## Agent roles
| Agent | Biological analog | Job |
|-----------|-------------------------------------------|-----|
| replay    | Hippocampal replay + schema assimilation  | Review priority nodes, propose integration |
| linker    | Relational binding (hippocampal CA1)      | Extract relations from episodes, cross-link |
| separator | Pattern separation (dentate gyrus)        | Resolve interfering memory pairs |
| transfer  | CLS (hippocampal → cortical transfer)     | Compress episodes into semantic summaries |
| health    | Synaptic homeostasis (SHY/Tononi)         | Audit graph health, flag structural issues |
## Invocation
Each prompt is a template. The harness (`poc-memory consolidate-batch`) fills in
the data sections with actual node content, graph metrics, and neighbor lists.
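The filling step can be pictured as plain string substitution. A minimal sketch in Python, with hypothetical template and section names (the harness's actual placeholders are not specified here):

```python
# Hypothetical prompt template; the section markers are illustrative only.
TEMPLATE = """You are the {role} consolidation agent.

## Priority nodes
{nodes}

## Graph metrics
{metrics}
"""


def fill_prompt(role, nodes, metrics):
    """Render one agent prompt from the shared template.

    nodes:   list of (key, summary) pairs
    metrics: dict of metric name -> value
    """
    node_block = "\n".join(f"- {key}: {summary}" for key, summary in nodes)
    metric_block = "\n".join(f"{name}: {value}" for name, value in metrics.items())
    return TEMPLATE.format(role=role, nodes=node_block, metrics=metric_block)


prompt = fill_prompt(
    "replay",
    [("node_a", "episodic note about session X")],
    {"clustering_coefficient": 0.42},
)
print(prompt)
```

Keeping the data sections as fenced-off blocks like this lets the same template serve all five roles, with only the role name and injected data varying per agent.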
## Output format
All agents output structured actions, one per line:
```
LINK source_key target_key [strength]
CATEGORIZE key category
COMPRESS key "one-sentence summary"
EXTRACT key topic_file.md section_name
CONFLICT key1 key2 "description"
DIFFERENTIATE key1 key2 "what makes them distinct"
MERGE key1 key2 "merged summary"
DIGEST "title" "content"
NOTE "observation about the graph or memory system"
```
The harness parses these and either executes them immediately (low-risk: LINK, CATEGORIZE, NOTE)
or queues them for review (high-risk: COMPRESS, EXTRACT, MERGE, DIGEST).
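Because quoted fields in the action grammar should become single arguments, the parsing step can be sketched with Python's `shlex`. This is an illustrative sketch, not the harness's actual implementation:

```python
import shlex

# Verbs the document says the harness may apply without review.
LOW_RISK = {"LINK", "CATEGORIZE", "NOTE"}


def parse_actions(text):
    """Parse agent output into (verb, args) tuples, one action per line."""
    actions = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        tokens = shlex.split(line)  # "quoted strings" become one token
        actions.append((tokens[0], tokens[1:]))
    return actions


output = """LINK node_a node_b 0.8
COMPRESS node_c "one-sentence summary"
NOTE "hub nodes look over-connected"
"""

for verb, args in parse_actions(output):
    route = "execute" if verb in LOW_RISK else "review"
    print(verb, route, args)
```

Routing on a fixed verb allowlist keeps the risky mutations (COMPRESS, EXTRACT, MERGE, DIGEST) behind review while still letting cheap, reversible edits land automatically.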