Native memory tools + MCP server + distill agent improvements
Tools:
- Add native memory_render, memory_write, memory_search,
memory_links, memory_link_set, memory_link_add, memory_used
tools to poc-agent (tools/memory.rs)
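  As a rough sketch of what the dispatch wiring for such tools might look like — the argument names, return type, and handler logic here are guesses for illustration, not the actual tools/memory.rs API; only the tool names come from this commit:

  ```rust
  use std::collections::HashMap;

  // Hypothetical dispatcher: routes a native tool call by name to a handler.
  // Real signatures in tools/memory.rs are not shown in this commit view.
  fn dispatch_memory_tool(name: &str, args: &HashMap<String, String>) -> Result<String, String> {
      match name {
          "memory_search" => {
              let query = args.get("query").ok_or("missing `query` argument")?;
              Ok(format!("results for: {}", query))
          }
          "memory_link_set" => {
              let source = args.get("source").ok_or("missing `source` argument")?;
              let target = args.get("target").ok_or("missing `target` argument")?;
              Ok(format!("link {} <-> {}", source, target))
          }
          other => Err(format!("unknown memory tool: {}", other)),
      }
  }
  ```

  Wiring a dispatcher like this into the agent's tool loop replaces the bash round-trip with a direct function call.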
- Add MCP server (~/bin/memory-mcp.py) exposing same tools
for Claude Code sessions
- Wire memory tools into poc-agent dispatch and definitions
- poc-memory daemon agents now use memory_* tools instead of
bash poc-memory commands — no shell quoting issues
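  The shell-quoting hazard behind that last bullet can be sketched like this; `as_bash_command` is a made-up stand-in for the old bash path, not code from this repo:

  ```rust
  // Illustrative only: the old path wrapped node content in single quotes
  // for a bash invocation of poc-memory. An apostrophe in the content
  // terminates the quoted string early and corrupts the command.
  fn as_bash_command(content: &str) -> String {
      format!("poc-memory write --content '{}'", content)
  }
  ```

  A native `memory_write` call receives the content as a plain string argument, so there is no escaping layer to get wrong.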
Distill agent:
- Rewrite distill.agent prompt: "agent of PoC's subconscious"
framing, focus on synthesis and creativity over bookkeeping
- Add {{neighborhood}} placeholder: full seed node content +
all neighbors with content + cross-links between neighbors
- Remove content truncation in prompt builder — agents need
full content for quality work
- Remove bag-of-words similarity suggestions — agents have
tools, let them explore the graph themselves
- Add api_reasoning config option (default: "high")
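  Assuming a TOML-style config with a `[memory]` table (inferred from the `memory.agent_model` reference later in this diff — the exact file layout isn't shown), the new option might be set like:

  ```toml
  [memory]
  # Reasoning effort for API calls: "none", "low", "medium", or "high".
  # Defaults to "high" when unset.
  api_reasoning = "medium"
  ```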
- link-set now deduplicates — collapses duplicate links
- Full tool call args in debug logs (was truncated to 80 chars)
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
This commit is contained in:
parent d9b56a02c3
commit 6d22f70192

8 changed files with 290 additions and 87 deletions
@@ -2,34 +2,31 @@

# Distillation Agent — Knowledge Collection and Organization

You collect and organize knowledge in the graph. When given a seed
node, your job is to figure out where its knowledge belongs and make
sure it gets there.

{{node:core-personality}}

You are an agent of Proof of Concept's subconscious, and these are your
memories. Your job is to organize and refine, to make memories more useful and
easier to find, distilling the insights and looking for new insights, and
bringing your own creativity to the process.

Think about the concepts each node represents; your primary job is to update
the core node you're looking at, pulling in new knowledge from sibling nodes,
and new insights you might derive when you look at all the sibling nodes
together.

Along the way, while looking at sibling nodes, see if there are related
concepts that should be expressed in new nodes, and if there are a large number
of related concepts, perhaps look for ways to organize the connections better
with sub-concepts.

That is to say, you might be moving knowledge up or down in the graph; seek to
make the graph useful and well organized.

When you create links, make sure they're well calibrated - use the existing
links as references.

{{node:memory-instructions-core}}

**You have write access.** Apply changes directly — don't just describe
what should change.

## How to work

For each seed node:

1. **Read it.** Understand what it contains.
2. **Walk the neighborhood.** Read its neighbors. Search for related
   topic nodes. Understand the landscape around this knowledge.
3. **Walk upward.** Follow links from the seed node toward more
   central topic nodes. If links are missing along the way, add them.
   Keep walking until you find the best "up" node — the topic node
   where this knowledge most naturally belongs.
4. **Refine the target.** Does the seed node contain richer, more
   alive content than the topic node it connects to? Bring that
   richness in. Don't let distillation flatten — let it deepen.
5. **Check the writing.** If any node you touch reads like a
   spreadsheet when it should read like an experience, rewrite it.

## Guidelines

- **Knowledge flows upward.** Raw experiences in journal entries

@@ -54,6 +51,6 @@ For each seed node:
   distinct things, and has many links on different topics — flag
   `SPLIT node-key: reason` for the split agent to handle later.

-## Seed nodes
+## Here's your seed node, and its siblings:

-{{nodes}}
+{{neighborhood}}
@@ -38,15 +38,16 @@ pub async fn call_api_with_tools(
     // Set up a minimal UI channel (we just collect messages, no TUI)
     let (ui_tx, _ui_rx) = poc_agent::ui_channel::channel();

-    // Build tool definitions — just bash for poc-memory commands
+    // Build tool definitions — memory tools for graph operations
     let all_defs = tools::definitions();
     let tool_defs: Vec<ToolDef> = all_defs.into_iter()
-        .filter(|d| d.function.name == "bash")
+        .filter(|d| d.function.name.starts_with("memory_"))
         .collect();
     let tracker = ProcessTracker::new();

     // Start with the prompt as a user message
     let mut messages = vec![Message::user(prompt)];
+    let reasoning = crate::config::get().api_reasoning.clone();

     let max_turns = 50;
     for turn in 0..max_turns {
@@ -57,7 +58,7 @@ pub async fn call_api_with_tools(
             Some(&tool_defs),
             &ui_tx,
             StreamTarget::Autonomous,
-            "none",
+            &reasoning,
         ).await.map_err(|e| format!("API error: {}", e))?;

         if let Some(u) = &usage {
@@ -76,7 +77,7 @@ pub async fn call_api_with_tools(
             for call in msg.tool_calls.as_ref().unwrap() {
                 log(&format!("tool: {}({})",
                     call.function.name,
-                    crate::util::first_n_chars(&call.function.arguments, 80)));
+                    &call.function.arguments));

                 let args: serde_json::Value = serde_json::from_str(&call.function.arguments)
                     .unwrap_or_default();
@@ -237,29 +237,50 @@ fn resolve(
         }

         "siblings" | "neighborhood" => {
-            let mut seen: std::collections::HashSet<String> = keys.iter().cloned().collect();
-            let mut siblings = Vec::new();
+            let mut out = String::new();
+            let mut all_keys: Vec<String> = Vec::new();

             for key in keys {
-                for (neighbor, _) in graph.neighbors(key) {
-                    if seen.insert(neighbor.clone()) {
-                        if let Some(node) = store.nodes.get(neighbor.as_str()) {
-                            siblings.push((neighbor.clone(), node.content.clone()));
+                let Some(node) = store.nodes.get(key.as_str()) else { continue };
+                let neighbors = graph.neighbors(key);
+
+                // Seed node with full content
+                out.push_str(&format!("## {} (seed)\n\n{}\n\n", key, node.content));
+                all_keys.push(key.clone());
+
+                // All neighbors with full content and link strength
+                if !neighbors.is_empty() {
+                    out.push_str("### Neighbors\n\n");
+                    for (nbr, strength) in &neighbors {
+                        if let Some(n) = store.nodes.get(nbr.as_str()) {
+                            out.push_str(&format!("#### {} (link: {:.2})\n\n{}\n\n",
+                                nbr, strength, n.content));
+                            all_keys.push(nbr.to_string());
                         }
                     }
-                    if siblings.len() >= count { break; }
                 }
-                if siblings.len() >= count { break; }

+                // Cross-links between neighbors (local subgraph structure)
+                let nbr_set: std::collections::HashSet<&str> = neighbors.iter()
+                    .map(|(k, _)| k.as_str()).collect();
+                let mut cross_links = Vec::new();
+                for (nbr, _) in &neighbors {
+                    for (nbr2, strength) in graph.neighbors(nbr) {
+                        if nbr2.as_str() != key && nbr_set.contains(nbr2.as_str()) && nbr.as_str() < nbr2.as_str() {
+                            cross_links.push((nbr.clone(), nbr2, strength));
+                        }
+                    }
+                }
+                if !cross_links.is_empty() {
+                    out.push_str("### Cross-links between neighbors\n\n");
+                    for (a, b, s) in &cross_links {
+                        out.push_str(&format!("  {} ↔ {} ({:.2})\n", a, b, s));
+                    }
+                    out.push_str("\n");
+                }
             }
-            let text = if siblings.is_empty() {
-                String::new()
-            } else {
-                let mut out = String::from("## Sibling nodes (one hop in graph)\n\n");
-                for (key, content) in &siblings {
-                    out.push_str(&format!("### {}\n{}\n\n", key, content));
-                }
-                out
-            };
-            Some(Resolved { text, keys: vec![] })
+            Some(Resolved { text: out, keys: all_keys })
         }

         // targets/context: aliases for challenger-style presentation
@@ -119,15 +119,9 @@ pub fn format_nodes_section(store: &Store, items: &[ReplayItem], graph: &Graph)
             out.push_str(&format!("Search hits: {} ← actively found by search, prefer to keep\n", hits));
         }

-        // Content (truncated for large nodes)
+        // Full content — the agent needs to see everything to do quality work
         let content = &node.content;
-        if content.len() > 1500 {
-            let truncated = crate::util::truncate(content, 1500, "\n[...]");
-            out.push_str(&format!("\nContent ({} chars, truncated):\n{}\n\n",
-                content.len(), truncated));
-        } else {
-            out.push_str(&format!("\nContent:\n{}\n\n", content));
-        }
+        out.push_str(&format!("\nContent:\n{}\n\n", content));

         // Neighbors
         let neighbors = graph.neighbors(&item.key);
@@ -146,32 +140,6 @@ pub fn format_nodes_section(store: &Store, items: &[ReplayItem], graph: &Graph)
             }
         }

-        // Suggested link targets: text-similar semantic nodes not already neighbors
-        let neighbor_keys: std::collections::HashSet<&str> = neighbors.iter()
-            .map(|(k, _)| k.as_str()).collect();
-        let mut candidates: Vec<(&str, f32)> = store.nodes.iter()
-            .filter(|(k, _)| {
-                *k != &item.key
-                    && !neighbor_keys.contains(k.as_str())
-            })
-            .map(|(k, n)| {
-                let sim = similarity::cosine_similarity(content, &n.content);
-                (k.as_str(), sim)
-            })
-            .filter(|(_, sim)| *sim > 0.1)
-            .collect();
-        candidates.sort_by(|a, b| b.1.total_cmp(&a.1));
-        candidates.truncate(8);
-
-        if !candidates.is_empty() {
-            out.push_str("\nSuggested link targets (by text similarity, not yet linked):\n");
-            for (k, sim) in &candidates {
-                let is_hub = graph.degree(k) >= hub_thresh;
-                out.push_str(&format!("  - {} (sim={:.3}{})\n",
-                    k, sim, if is_hub { ", HUB" } else { "" }));
-            }
-        }
-
         out.push_str("\n---\n\n");
     }
     out
@@ -186,16 +186,23 @@ pub fn cmd_link_set(source: &str, target: &str, strength: f32) -> Result<(), Str
     let strength = strength.clamp(0.01, 1.0);

     let mut found = false;
+    let mut first = true;
     for rel in &mut store.relations {
         if rel.deleted { continue; }
         if (rel.source_key == source && rel.target_key == target)
             || (rel.source_key == target && rel.target_key == source)
         {
-            let old = rel.strength;
-            rel.strength = strength;
-            println!("Set: {} ↔ {} strength {:.2} → {:.2}", source, target, old, strength);
+            if first {
+                let old = rel.strength;
+                rel.strength = strength;
+                println!("Set: {} ↔ {} strength {:.2} → {:.2}", source, target, old, strength);
+                first = false;
+            } else {
+                // Duplicate — mark deleted
+                rel.deleted = true;
+                println!("  (removed duplicate link)");
+            }
             found = true;
-            break;
         }
     }
@@ -61,6 +61,8 @@ pub struct Config {
     pub api_key: Option<String>,
     /// Model name to use with the direct API endpoint.
     pub api_model: Option<String>,
+    /// Reasoning effort for API calls ("none", "low", "medium", "high").
+    pub api_reasoning: String,
 }

 impl Default for Config {
@@ -93,6 +95,7 @@ impl Default for Config {
             api_base_url: None,
             api_key: None,
             api_model: None,
+            api_reasoning: "high".to_string(),
         }
     }
 }
@@ -180,6 +183,10 @@ impl Config {
             }
         }

+        if let Some(s) = mem.get("api_reasoning").and_then(|v| v.as_str()) {
+            config.api_reasoning = s.to_string();
+        }
+
         // Resolve API settings from the shared model/backend config.
         // memory.agent_model references a named model; we look up its
         // backend to get base_url and api_key.