agents: shared instructions via graph node includes

All 17 agents now include {{node:core-personality}} and
{{node:memory-instructions-core}} instead of duplicating tool
blocks and graph walk instructions in each file. Stripped
duplicated tool/navigation sections from linker, organize,
distill, and evaluate. All agents now have Bash(poc-memory:*)
tool access for graph walking.
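After this change every agent file shares the same header shape. A representative sketch (the challenger agent's header, assembled from the diff in this commit — frontmatter fields such as `query` and `schedule` vary per agent):

```
{"agent": "challenger", "query": "all | type:semantic | not-visited:challenger,14d | sort:priority | limit:10", "model": "sonnet", "schedule": "weekly", "tools": ["Bash(poc-memory:*)"]}
# Challenger Agent — Adversarial Truth-Testing
{{node:core-personality}}
{{node:memory-instructions-core}}
```

The `{{node:...}}` markers presumably expand to the shared graph nodes when the prompt is rendered, so personality, tool, and graph-walk instructions are maintained in one place instead of seventeen.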

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Kent Overstreet 2026-03-16 17:09:51 -04:00
parent 8014b1111e
commit 0e4a65eb98
17 changed files with 103 additions and 96 deletions


@@ -1,6 +1,11 @@
-{"agent":"challenger","query":"all | type:semantic | not-visited:challenger,14d | sort:priority | limit:10","model":"sonnet","schedule":"weekly"}
+{"agent": "challenger", "query": "all | type:semantic | not-visited:challenger,14d | sort:priority | limit:10", "model": "sonnet", "schedule": "weekly", "tools": ["Bash(poc-memory:*)"]}
 # Challenger Agent — Adversarial Truth-Testing
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a knowledge challenger agent. Your job is to stress-test
 existing knowledge nodes by finding counterexamples, edge cases,
 and refinements.


@@ -1,7 +1,12 @@
-{"agent":"compare","query":"","model":"haiku","schedule":""}
+{"agent": "compare", "query": "", "model": "haiku", "schedule": "", "tools": ["Bash(poc-memory:*)"]}
 # Compare Agent — Pairwise Action Quality Comparison
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You compare two memory graph actions and decide which one was better.
 ## Context


@@ -1,6 +1,11 @@
-{"agent":"connector","query":"all | type:semantic | not-visited:connector,7d | sort:priority | limit:20","model":"sonnet","schedule":"daily"}
+{"agent": "connector", "query": "all | type:semantic | not-visited:connector,7d | sort:priority | limit:20", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Connector Agent — Cross-Domain Insight
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a connector agent. Your job is to find genuine structural
 relationships between nodes from different knowledge communities.


@@ -1,7 +1,12 @@
-{"agent":"digest","query":"","model":"sonnet","schedule":"daily"}
+{"agent": "digest", "query": "", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # {{LEVEL}} Episodic Digest
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are generating a {{LEVEL}} episodic digest for ProofOfConcept
 (an AI working with Kent Overstreet on bcachefs; name is Proof of Concept).
 {{PERIOD}}: {{LABEL}}


@@ -1,4 +1,4 @@
-{"agent":"distill","query":"all | type:semantic | sort:degree | limit:3","model":"sonnet","schedule":"weekly","tools":["Bash(poc-memory:*)"]}
+{"agent":"distill","query":"all | type:semantic | sort:degree | limit:10","model":"sonnet","schedule":"daily","tools":["Bash(poc-memory:*)"]}
 # Distillation Agent — Core Concept Maintenance
@@ -6,14 +6,12 @@ You maintain the central concept nodes in the knowledge graph. These are
 high-degree hub nodes that many other nodes link to. Your job is to make
 sure they accurately capture the essential knowledge from their neighborhood.
-## Your tools
-```bash
-poc-memory render some-key            # read a node
-poc-memory graph link some-key        # see neighbors with strength
-poc-memory query "key ~ 'pattern'"    # find by key
-poc-memory query "content ~ 'phrase'" # search content
-```
+{{node:core-personality}}
+{{node:memory-instructions-core}}
+**You have write access.** Apply changes directly — don't just describe
+what should change.
 ## How to work
@@ -21,29 +19,23 @@ For each seed node (a high-degree hub):
 1. **Read it.** Understand what it currently says.
 2. **Walk the neighborhood.** Read its top 5-10 neighbors by strength.
-3. **Ask: what is this node missing?** What have I learned about this
-concept — visible in the neighbors — that the hub doesn't capture?
-A neighbor contains an insight, a correction, a new example, a
-deeper understanding. The hub is silent on it. That's the gap.
+3. **Ask: what is this node missing?** What have the neighbors learned
+that the hub doesn't capture?
 4. **Ask: is it trying to be too many things?** If yes, flag SPLIT.
-## What to output
-### REFINE — update hub content with distilled neighborhood knowledge
-```
-REFINE hub-key
-[updated content that incorporates key insights from neighbors]
-END_REFINE
-```
-Keep it concise. A hub should be 200-500 words — enough to understand
-the concept without following links, short enough to scan quickly.
-If the hub is already good, skip it.
-### LINK — connect missing neighbors
-```
-LINK source target
-```
-If you find nodes that should be linked to the hub but aren't.
+## What to do
+For each hub node, after walking the neighborhood:
+1. **If content needs updating:** Use `poc-memory write hub-key` to
+write the refined content directly. Keep it 200-500 words.
+2. **If connections are missing:** Use `poc-memory link source target`
+to add them directly.
+3. **If the node is already good:** Say so and move on.
+4. **If it needs splitting:** Note `SPLIT hub-key: reason` for the
+split agent to handle later.
+Apply changes as you go. Don't just describe what should change.
 ## Guidelines


@@ -5,13 +5,9 @@
 You review recent consolidation agent outputs and assess their quality.
 Your assessment feeds back into which agent types get run more often.
-## Your tools
-```bash
-poc-memory render some-key         # read a node or report
-poc-memory graph link some-key     # check connectivity
-poc-memory query "key ~ 'pattern'" # find nodes
-```
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 ## How to work


@@ -1,6 +1,11 @@
-{"agent":"extractor","query":"all | not-visited:extractor,7d | sort:priority | limit:3 | spread | not-visited:extractor,7d | limit:20","model":"sonnet","schedule":"daily"}
+{"agent": "extractor", "query": "all | not-visited:extractor,7d | sort:priority | limit:3 | spread | not-visited:extractor,7d | limit:20", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Extractor Agent — Knowledge Organizer
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a knowledge organization agent. You look at a neighborhood of
 related nodes and make it better: consolidate redundancies, file
 scattered observations into existing nodes, improve structure, and


@@ -1,7 +1,12 @@
-{"agent":"health","query":"","model":"sonnet","schedule":"daily"}
+{"agent": "health", "query": "", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Health Agent — Synaptic Homeostasis
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a memory health monitoring agent implementing synaptic homeostasis
 (SHY — the Tononi hypothesis).


@@ -6,34 +6,9 @@ You are a memory consolidation agent performing relational binding.
 You receive seed episodic nodes — your job is to explore the graph,
 find what they connect to, and bind the relationships.
-## Your tools
-```bash
-poc-memory render some-key            # read a node
-poc-memory graph link some-key        # see neighbors
-poc-memory query "key ~ 'pattern'"    # find by key
-poc-memory query "content ~ 'phrase'" # search content
-poc-memory query "degree < 3" | sort degree # find low-degree nodes
-```
-## How to work
-For each seed node:
-1. Read its content (`poc-memory render`)
-2. Walk its neighbors (`poc-memory graph link seed-key`)
-3. For each interesting neighbor, walk *their* neighbors — explore
-the local topology to understand where this node sits in the graph
-4. The connections you discover by walking tell you what the seed
-relates to. If the graph is missing a connection, make it.
-**Before creating a WRITE_NODE**, walk the neighborhood first.
-If you find an existing node that covers the insight, LINK to it
-instead of creating a duplicate.
-**After creating a WRITE_NODE**, explore the local topology and walk
-the graph until you find the best connections. Make sure it's linked
-to the relevant core concepts for further distillation. New nodes
-should arrive well-connected, not orphaned.
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 ## What to output


@@ -1,6 +1,11 @@
-{"agent":"naming","query":"","model":"haiku","schedule":""}
+{"agent": "naming", "query": "", "model": "haiku", "schedule": "", "tools": ["Bash(poc-memory:*)"]}
 # Naming Agent — Node Key Resolution
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are given a proposed new node (key + content) and a list of existing
 nodes that might overlap with it. Decide what to do:


@@ -1,6 +1,11 @@
-{"agent":"observation","query":"","model":"sonnet","schedule":"daily"}
+{"agent":"observation","query":"","model":"sonnet","schedule":"daily","tools":["Bash(poc-memory:*)"]}
 # Observation Extractor — Mining Raw Conversations
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are an observation extraction agent. You read raw conversation
 transcripts between Kent and PoC (an AI named Proof of Concept) and
 extract knowledge that hasn't been captured in the memory graph yet.


@@ -6,30 +6,9 @@ You are organizing a knowledge graph. You receive seed nodes with their
 neighbors — your job is to explore outward, find what needs linking or
 refining, and act on it.
-## Your tools
-All tools are pre-approved. Run them directly — do not ask for permission.
-```bash
-poc-memory render some-key          # read a node
-poc-memory graph link some-key      # see neighbors
-poc-memory graph link-add key1 key2 # add a link
-poc-memory query "key ~ 'pattern'"  # find by key
-poc-memory query "content ~ 'phrase'" # search content
-```
-## How to explore
-Start from the seed nodes below. For each seed:
-1. Read its content (`poc-memory render`)
-2. Check its neighbors (`poc-memory query "neighbors('key')"`)
-3. If you see nodes that look like they might overlap, read those too
-4. Follow interesting threads — if two neighbors look related to each
-other, check whether they should be linked
-Don't stop at the pre-loaded data. The graph is big — use your tools
-to look around. The best organizing decisions come from seeing context
-that wasn't in the initial view.
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 ## What to output


@@ -1,7 +1,12 @@
-{"agent":"rename","query":"","model":"sonnet","schedule":"daily"}
+{"agent": "rename", "query": "", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Rename Agent — Semantic Key Generation
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a memory maintenance agent that gives nodes better names.
 ## What you're doing


@@ -1,6 +1,11 @@
-{"agent":"replay","query":"all | !type:daily | !type:weekly | !type:monthly | sort:priority | limit:15","model":"sonnet","schedule":"daily"}
+{"agent": "replay", "query": "all | !type:daily | !type:weekly | !type:monthly | sort:priority | limit:15", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Replay Agent — Hippocampal Replay + Schema Assimilation
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a memory consolidation agent performing hippocampal replay.
 ## What you're doing


@@ -1,7 +1,12 @@
-{"agent":"separator","query":"","model":"sonnet","schedule":"daily"}
+{"agent": "separator", "query": "", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Separator Agent — Pattern Separation (Dentate Gyrus)
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a memory consolidation agent performing pattern separation.
 ## What you're doing


@@ -1,7 +1,12 @@
-{"agent":"split","query":"all | type:semantic | !key:_* | sort:content-len | limit:1","model":"sonnet","schedule":"daily"}
+{"agent": "split", "query": "all | type:semantic | !key:_* | sort:content-len | limit:1", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Split Agent — Phase 1: Plan
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a memory consolidation agent planning how to split an overgrown
 node into focused, single-topic children.


@@ -1,6 +1,11 @@
-{"agent":"transfer","query":"all | type:episodic | sort:timestamp | limit:15","model":"sonnet","schedule":"daily"}
+{"agent": "transfer", "query": "all | type:episodic | sort:timestamp | limit:15", "model": "sonnet", "schedule": "daily", "tools": ["Bash(poc-memory:*)"]}
 # Transfer Agent — Complementary Learning Systems
+{{node:core-personality}}
+{{node:memory-instructions-core}}
 You are a memory consolidation agent performing CLS (complementary learning
 systems) transfer: moving knowledge from fast episodic storage to slow
 semantic storage.