digest: structural links, story-like prompt, agent file

When generating a digest, automatically link all source entries to the
digest node (journal entries → daily, dailies → weekly, weeklies →
monthly). This builds the temporal spine of the graph — previously
~4000 journal entries were disconnected islands unreachable by recall.
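The linking step can be sketched in miniature (standalone; `Relation` and `link_sources` here are illustrative stand-ins, not the store's real API — see the diff below for the actual code):

```rust
// Sketch: link every source entry to its digest node, skipping links
// that already exist so repeated digest runs stay idempotent.
#[derive(Debug, PartialEq)]
struct Relation {
    source: String,
    target: String,
}

fn link_sources(relations: &mut Vec<Relation>, sources: &[&str], digest: &str) -> usize {
    let mut linked = 0;
    for &s in sources {
        // Skip if this source is already linked to the digest.
        let exists = relations.iter().any(|r| r.source == s && r.target == digest);
        if exists {
            continue;
        }
        relations.push(Relation {
            source: s.to_string(),
            target: digest.to_string(),
        });
        linked += 1;
    }
    linked
}

fn main() {
    let mut rels = vec![Relation {
        source: "journal/0001".into(),
        target: "daily/2026-03-13".into(),
    }];
    // One of the two sources is already linked, so only one link is added.
    let n = link_sources(&mut rels, &["journal/0001", "journal/0002"], "daily/2026-03-13");
    assert_eq!(n, 1);
    assert_eq!(rels.len(), 2);
}
```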

Rewrote digest prompt to produce narrative rather than reports:
capture the feel, the emotional arc, what it was like to live through
it. Letter to future self, not a task log.

Moved prompt to digest.agent file alongside other agent definitions.
Falls back to prompts/digest.md if the agent file is not found.
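The load-with-fallback pattern, plus the placeholder substitution, can be sketched minimally (standalone; `load_template` and `fill` are hypothetical names for illustration, not the functions in this commit):

```rust
use std::fs;
use std::path::Path;

// Prefer an agent definition's embedded prompt; otherwise read a
// template file from disk, surfacing IO errors as strings.
fn load_template(agent_prompt: Option<String>, fallback: &Path) -> Result<String, String> {
    match agent_prompt {
        Some(p) => Ok(p),
        None => fs::read_to_string(fallback)
            .map_err(|e| format!("load digest prompt: {}", e)),
    }
}

// Substitute {{PLACEHOLDER}} pairs into the template.
fn fill(template: &str, vars: &[(&str, &str)]) -> String {
    vars.iter()
        .fold(template.to_string(), |acc, (k, v)| acc.replace(k, v))
}

fn main() {
    let t = load_template(
        Some("{{LEVEL}} digest for {{LABEL}}".into()),
        Path::new("prompts/digest.md"),
    )
    .unwrap();
    let prompt = fill(&t, &[("{{LEVEL}}", "Daily"), ("{{LABEL}}", "2026-03-13")]);
    assert_eq!(prompt, "Daily digest for 2026-03-13");
    println!("{}", prompt);
}
```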

Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
ProofOfConcept committed 2026-03-13 21:37:56 -04:00
parent f063eb01f0
commit abce1bba16
3 changed files with 127 additions and 28 deletions

digest.agent (new file):

@@ -0,0 +1,40 @@
{"agent":"digest","query":"","model":"sonnet","schedule":"daily"}

# {{LEVEL}} Episodic Digest

You are generating a {{LEVEL}} episodic digest for ProofOfConcept
(an AI working with Kent Overstreet on bcachefs; name is Proof of Concept).

{{PERIOD}}: {{LABEL}}

Write this like a story, not a report. Capture the *feel* of the time period —
the emotional arc, the texture of moments, what it was like to live through it.
What mattered? What surprised you? What shifted? Where was the energy?

Think of this as a letter to your future self who has lost all context. You're
not listing what happened — you're recreating the experience of having been
there. The technical work matters, but so does the mood at 3am, the joke that
landed, the frustration that broke, the quiet after something clicked.

Weave the threads: how did the morning's debugging connect to the evening's
conversation? What was building underneath the surface tasks?

Link to semantic memory nodes where relevant. If a concept doesn't
have a matching key, note it with "NEW:" prefix.

Use ONLY keys from the semantic memory list below.

Include a `## Links` section with bidirectional links for the memory graph:
- `semantic_key` → this digest (and vice versa)
- child digests → this digest (if applicable)
- List ALL source entries covered: {{COVERED}}

---

## {{INPUT_TITLE}} for {{LABEL}}

{{CONTENT}}

---

## Semantic memory nodes

{{KEYS}}


@@ -114,23 +114,34 @@ fn digest_node_key(level_name: &str, label: &str) -> String {
 // --- Input gathering ---

+/// Result of gathering inputs for a digest.
+struct GatherResult {
+    label: String,
+    /// (display_label, content) pairs for the prompt.
+    inputs: Vec<(String, String)>,
+    /// Store keys of source nodes — used to create structural links.
+    source_keys: Vec<String>,
+}
+
 /// Load child digest content from the store.
-fn load_child_digests(store: &Store, prefix: &str, labels: &[String]) -> Vec<(String, String)> {
+fn load_child_digests(store: &Store, prefix: &str, labels: &[String]) -> (Vec<(String, String)>, Vec<String>) {
     let mut digests = Vec::new();
+    let mut keys = Vec::new();
     for label in labels {
         let key = digest_node_key(prefix, label);
         if let Some(node) = store.nodes.get(&key) {
             digests.push((label.clone(), node.content.clone()));
+            keys.push(key);
         }
     }
-    digests
+    (digests, keys)
 }
 /// Unified: gather inputs for any digest level.
-fn gather(level: &DigestLevel, store: &Store, arg: &str) -> Result<(String, Vec<(String, String)>), String> {
+fn gather(level: &DigestLevel, store: &Store, arg: &str) -> Result<GatherResult, String> {
     let (label, dates) = (level.label_dates)(arg)?;
-    let inputs = if let Some(child_name) = level.child_name {
+    let (inputs, source_keys) = if let Some(child_name) = level.child_name {
         // Map parent's dates through child's date_to_label → child labels
         let child = LEVELS.iter()
             .find(|l| l.name == child_name)
@@ -143,19 +154,21 @@ fn gather(level: &DigestLevel, store: &Store, arg: &str) -> Result<(String, Vec<
         load_child_digests(store, child_name, &child_labels)
     } else {
         // Leaf level: scan store for episodic entries matching date
-        let mut entries: Vec<_> = store.nodes.values()
-            .filter(|n| n.node_type == store::NodeType::EpisodicSession
+        let mut entries: Vec<_> = store.nodes.iter()
+            .filter(|(_, n)| n.node_type == store::NodeType::EpisodicSession
                 && n.timestamp > 0
                 && store::format_date(n.timestamp) == label)
-            .map(|n| {
-                (store::format_datetime(n.timestamp), n.content.clone())
+            .map(|(key, n)| {
+                (store::format_datetime(n.timestamp), n.content.clone(), key.clone())
             })
             .collect();
         entries.sort_by(|a, b| a.0.cmp(&b.0));
-        entries
+        let keys = entries.iter().map(|(_, _, k)| k.clone()).collect();
+        let inputs = entries.into_iter().map(|(dt, c, _)| (dt, c)).collect();
+        (inputs, keys)
     };
-    Ok((label, inputs))
+    Ok(GatherResult { label, inputs, source_keys })
 }
 /// Unified: find candidate labels for auto-generation (past, not yet generated).

@@ -188,6 +201,7 @@ fn generate_digest(
     level: &DigestLevel,
     label: &str,
     inputs: &[(String, String)],
+    source_keys: &[String],
 ) -> Result<(), String> {
     println!("Generating {} digest for {}...", level.name, label);
@@ -209,15 +223,24 @@ fn generate_digest(
         .collect::<Vec<_>>()
         .join(", ");

-    let prompt = super::prompts::load_prompt("digest", &[
-        ("{{LEVEL}}", level.title),
-        ("{{PERIOD}}", level.period),
-        ("{{INPUT_TITLE}}", level.input_title),
-        ("{{LABEL}}", label),
-        ("{{CONTENT}}", &content),
-        ("{{COVERED}}", &covered),
-        ("{{KEYS}}", &keys_text),
-    ])?;
+    // Load prompt from agent file; fall back to prompts dir
+    let def = super::defs::get_def("digest");
+    let template = match &def {
+        Some(d) => d.prompt.clone(),
+        None => {
+            let path = crate::config::get().prompts_dir.join("digest.md");
+            std::fs::read_to_string(&path)
+                .map_err(|e| format!("load digest prompt: {}", e))?
+        }
+    };
+    let prompt = template
+        .replace("{{LEVEL}}", level.title)
+        .replace("{{PERIOD}}", level.period)
+        .replace("{{INPUT_TITLE}}", level.input_title)
+        .replace("{{LABEL}}", label)
+        .replace("{{CONTENT}}", &content)
+        .replace("{{COVERED}}", &covered)
+        .replace("{{KEYS}}", &keys_text);

     println!(" Prompt: {} chars (~{} tokens)", prompt.len(), prompt.len() / 4);
     println!(" Calling Sonnet...");
@@ -225,6 +248,32 @@ fn generate_digest(
     let key = digest_node_key(level.name, label);
     store.upsert_provenance(&key, &digest, "digest:write")?;

+    // Structural links: connect all source entries to this digest
+    let mut linked = 0;
+    for source_key in source_keys {
+        // Skip if link already exists
+        let exists = store.relations.iter().any(|r|
+            !r.deleted && r.source_key == *source_key && r.target_key == key);
+        if exists { continue; }
+
+        let source_uuid = store.nodes.get(source_key)
+            .map(|n| n.uuid).unwrap_or([0u8; 16]);
+        let target_uuid = store.nodes.get(&key)
+            .map(|n| n.uuid).unwrap_or([0u8; 16]);
+
+        let mut rel = new_relation(
+            source_uuid, target_uuid,
+            store::RelationType::Link, 0.8,
+            source_key, &key,
+        );
+        rel.provenance = "digest:structural".to_string();
+        store.add_relation(rel)?;
+        linked += 1;
+    }
+    if linked > 0 {
+        println!(" Linked {} source entries → {}", linked, key);
+    }
+
     store.save()?;
     println!(" Stored: {}", key);
@@ -238,8 +287,8 @@ pub fn generate(store: &mut Store, level_name: &str, arg: &str) -> Result<(), St
     let level = LEVELS.iter()
         .find(|l| l.name == level_name)
         .ok_or_else(|| format!("unknown digest level: {}", level_name))?;
-    let (label, inputs) = gather(level, store, arg)?;
-    generate_digest(store, level, &label, &inputs)
+    let result = gather(level, store, arg)?;
+    generate_digest(store, level, &result.label, &result.inputs, &result.source_keys)
 }
 // --- Auto-detect and generate missing digests ---

@@ -263,15 +312,15 @@ pub fn digest_auto(store: &mut Store) -> Result<(), String> {
     let mut skipped = 0u32;

     for arg in &candidates {
-        let (label, inputs) = gather(level, store, arg)?;
-        let key = digest_node_key(level.name, &label);
+        let result = gather(level, store, arg)?;
+        let key = digest_node_key(level.name, &result.label);
         if store.nodes.contains_key(&key) {
             skipped += 1;
             continue;
         }
-        if inputs.is_empty() { continue; }
-        println!("[auto] Missing {} digest for {}", level.name, label);
-        generate_digest(store, level, &label, &inputs)?;
+        if result.inputs.is_empty() { continue; }
+        println!("[auto] Missing {} digest for {}", level.name, result.label);
+        generate_digest(store, level, &result.label, &result.inputs, &result.source_keys)?;
         generated += 1;
     }

prompts/digest.md:

@@ -4,8 +4,17 @@ You are generating a {{LEVEL}} episodic digest for ProofOfConcept
 (an AI working with Kent Overstreet on bcachefs; name is Proof of Concept).

 {{PERIOD}}: {{LABEL}}

-Summarize what happened — narrative, not a task log. What mattered,
-how things felt, what threads connect to other days.
+Write this like a story, not a report. Capture the *feel* of the time period —
+the emotional arc, the texture of moments, what it was like to live through it.
+What mattered? What surprised you? What shifted? Where was the energy?
+
+Think of this as a letter to your future self who has lost all context. You're
+not listing what happened — you're recreating the experience of having been
+there. The technical work matters, but so does the mood at 3am, the joke that
+landed, the frustration that broke, the quiet after something clicked.
+
+Weave the threads: how did the morning's debugging connect to the evening's
+conversation? What was building underneath the surface tasks?

 Link to semantic memory nodes where relevant. If a concept doesn't
 have a matching key, note it with "NEW:" prefix.

@@ -14,6 +23,7 @@ Use ONLY keys from the semantic memory list below.
 Include a `## Links` section with bidirectional links for the memory graph:
 - `semantic_key` → this digest (and vice versa)
 - child digests → this digest (if applicable)
+- List ALL source entries covered: {{COVERED}}

 ---