config-driven context loading, consolidate hooks, add docs

Move the hardcoded context priority groups from cmd_load_context() into
the config file as [context.NAME] sections, and add journal_days and
journal_max settings. The config parser now handles section headers while
preserving group order.

Consolidate load-memory.sh into the memory-search binary: it now handles
both session-start context loading (first prompt) and ambient search
(subsequent prompts), eliminating the shell script. Update install_hook()
to reference ~/.cargo/bin/memory-search and remove the old load-memory.sh
entry from settings.json.

Add end-user documentation (doc/README.md) covering installation,
configuration, all commands, hook mechanics, and notes for AI assistants
using the system.

Co-Authored-By: ProofOfConcept <poc@bcachefs.org>
This commit is contained in:
parent a8aaadb0ad
commit 90d60894ed
5 changed files with 336 additions and 78 deletions
173
doc/README.md
Normal file
@ -0,0 +1,173 @@
# poc-memory

A persistent memory system for AI assistants. Stores knowledge as a
weighted graph of nodes and relations, with automatic recall via Claude
Code hooks.

## Quick start

```bash
# Install
cargo install --path .

# Initialize the store
poc-memory init

# Install Claude Code hooks and systemd service
poc-memory daemon install
```

## Configuration

Config file: `~/.config/poc-memory/config.toml`

```toml
# Names used in transcripts and agent prompts
user_name = "Alice"
assistant_name = "MyAssistant"

# Where memory data lives (store, logs, episodic digests)
data_dir = "~/.claude/memory"

# Where Claude Code session transcripts are stored
projects_dir = "~/.claude/projects"

# Nodes that should never be decayed (comma-separated)
core_nodes = "identity.md, preferences.md"

# Journal settings for session-start context loading
journal_days = 7
journal_max = 20

# Context groups loaded at session start, in order.
# Each [context.NAME] section specifies a group of nodes to load.
# If no "label" is given, the section name is used (underscores become spaces).
[context.identity]
keys = "identity.md"

[context.people]
keys = "alice.md, bob.md"

[context.technical]
keys = "project-notes.md, architecture.md"

# Orientation loaded last — current task state, not deep identity
[context.orientation]
keys = "where-am-i.md"
```

Override the config path with `POC_MEMORY_CONFIG=/path/to/config.toml`.
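A minimal sketch of that override logic (assuming the default path shown above; the actual resolution code in poc-memory may differ in detail):

```rust
use std::env;
use std::path::PathBuf;

// Resolve the config file location. POC_MEMORY_CONFIG, when set and
// non-empty, overrides the default ~/.config/poc-memory/config.toml.
// Sketch only, not the shipped lookup code.
fn config_path(override_path: Option<&str>) -> PathBuf {
    match override_path {
        Some(p) if !p.is_empty() => PathBuf::from(p),
        _ => {
            let home = env::var("HOME").unwrap_or_else(|_| ".".to_string());
            PathBuf::from(home).join(".config/poc-memory/config.toml")
        }
    }
}

fn main() {
    let over = env::var("POC_MEMORY_CONFIG").ok();
    println!("config: {}", config_path(over.as_deref()).display());
}
```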

## Commands

### Core operations

```bash
poc-memory init                  # Initialize empty store
poc-memory search QUERY          # Search nodes (1-3 words, AND logic)
poc-memory render KEY            # Output a node's content
poc-memory write KEY < content   # Upsert a node from stdin
poc-memory delete KEY            # Soft-delete a node
poc-memory rename OLD NEW        # Rename a node (preserves UUID/edges)
poc-memory categorize KEY CAT    # Set category: core/tech/gen/obs/task
```

### Journal

```bash
poc-memory journal-write "text"  # Write a journal entry
poc-memory journal-tail [N]      # Show last N entries (default 20)
poc-memory journal-tail N --full # Show full content (not truncated)
```

### Feedback loop

```bash
poc-memory used KEY              # Mark a recalled node as useful (boosts weight)
poc-memory wrong KEY [CONTEXT]   # Mark a node as wrong (reduces weight)
poc-memory gap DESCRIPTION       # Record a knowledge gap for later filling
```

### Graph operations

```bash
poc-memory link N                # Interactive graph walk from a node
poc-memory graph                 # Show graph statistics
poc-memory status                # Store overview: node/edge counts, categories
```

### Maintenance

```bash
poc-memory decay                 # Apply weight decay to all nodes
poc-memory consolidate-session   # Guided 6-step memory consolidation
```

### Context loading (used by hooks)

```bash
poc-memory load-context          # Output full session-start context
```

This loads all context groups from the config file in order, followed by
recent journal entries. The `memory-search` hook binary calls this
automatically on session start.

### Daemon

```bash
poc-memory daemon                # Run the background daemon
poc-memory daemon install        # Install systemd service + Claude hooks
```

The daemon watches for completed Claude sessions and runs experience
mining and fact extraction on transcripts.

### Mining (used by daemon)

```bash
poc-memory experience-mine PATH  # Extract experiences from a transcript
poc-memory fact-mine-store PATH  # Extract facts and store them
```

## How the hooks work

The `memory-search` binary is a Claude Code `UserPromptSubmit` hook. On
each prompt it:

1. **First prompt of a session**: Runs `poc-memory load-context` to inject
   full memory context (identity, reflections, journal, orientation).
2. **Post-compaction**: Detects context compaction and reloads full context.
3. **Every prompt**: Extracts keywords and searches the store for relevant
   memories. Deduplicates against previously shown results for the session.

Session state (cookies, seen-keys) is tracked in `/tmp/claude-memory-search/`
and cleaned up after 24 hours.
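The cleanup pass can be sketched like this (a hypothetical stand-in for the hook's `cleanup_stale_files` helper; the real implementation may differ, but it is best-effort by design):

```rust
use std::fs;
use std::path::Path;
use std::time::{Duration, SystemTime};

// Best-effort removal of session state files older than max_age.
// Sketch of the behaviour described above, not the exact implementation.
fn cleanup_stale_files(dir: &Path, max_age: Duration) {
    let Ok(entries) = fs::read_dir(dir) else { return };
    let now = SystemTime::now();
    for entry in entries.flatten() {
        let too_old = entry
            .metadata()
            .and_then(|m| m.modified())
            .ok()
            .and_then(|t| now.duration_since(t).ok())
            .map_or(false, |age| age > max_age);
        if too_old {
            let _ = fs::remove_file(entry.path());
        }
    }
}

fn main() {
    // Prune anything older than 24 hours.
    cleanup_stale_files(Path::new("/tmp/claude-memory-search"), Duration::from_secs(86400));
}
```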

## Architecture

- **Store**: Append-only Cap'n Proto log (`nodes.capnp`, `relations.capnp`)
  with in-memory cache. Nodes have UUIDs, versions, weights, categories,
  and spaced-repetition intervals.
- **Graph**: Nodes connected by typed relations (link, auto, derived).
  Community detection and clustering coefficients computed on demand.
- **Search**: TF-IDF weighted keyword search over node content.
- **Decay**: Exponential weight decay with category-specific factors.
  Core nodes decay slowest; observations decay fastest.
- **Daemon**: Uses jobkit for task scheduling with resource-gated LLM
  access (one slot by default to manage API costs).
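The decay rule itself reduces to one line. Here is a minimal sketch, with made-up daily factors standing in for the real category-specific constants in the poc-memory source:

```rust
// Exponential decay: weight shrinks by a fixed factor per idle day.
// The factors used below are illustrative examples only.
fn decayed_weight(weight: f64, days_idle: f64, daily_factor: f64) -> f64 {
    weight * daily_factor.powf(days_idle)
}

fn main() {
    // A core node (slow decay) vs an observation node (fast decay).
    let core = decayed_weight(1.0, 30.0, 0.999);
    let obs = decayed_weight(1.0, 30.0, 0.95);
    println!("core after 30 idle days: {:.3}", core);
    println!("obs  after 30 idle days: {:.3}", obs);
}
```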

## For AI assistants

If you're an AI assistant using this system, here's what matters:

- **Search before creating**: Always `poc-memory search` before writing
  new nodes to avoid duplicates.
- **Close the feedback loop**: When recalled memories shaped your response,
  call `poc-memory used KEY`. When a memory was wrong, call
  `poc-memory wrong KEY`. This trains the weight system.
- **Journal is the river, topic nodes are the delta**: Write experiences
  to the journal. During consolidation, pull themes into topic nodes.
- **Config tells you who you are**: `poc-memory` reads your name from
  the config file. Agent prompts use these names instead of generic
  "the user" / "the assistant".
@ -1,11 +1,11 @@
// memory-search: hook binary for ambient memory retrieval
// memory-search: combined hook for session context loading + ambient memory retrieval
//
// On first prompt per session: loads full memory context (identity, journal, etc.)
// On subsequent prompts: searches memory for relevant entries
// On post-compaction: reloads full context
//
// Reads JSON from stdin (Claude Code UserPromptSubmit hook format),
// searches memory for relevant entries, outputs results tagged with
// an anti-injection cookie.
//
// This is a thin wrapper that delegates to the poc-memory search
// engine but formats output for the hook protocol.
// outputs results for injection into the conversation.

use std::collections::HashSet;
use std::fs;
@ -30,26 +30,57 @@ fn main() {
        return;
    }

    // Skip short prompts
    let state_dir = PathBuf::from("/tmp/claude-memory-search");
    fs::create_dir_all(&state_dir).ok();

    // Detect post-compaction reload
    let is_compaction = prompt.contains("continued from a previous conversation");

    // First prompt or post-compaction: load full context
    let cookie_path = state_dir.join(format!("cookie-{}", session_id));
    let is_first = !cookie_path.exists();

    if is_first || is_compaction {
        // Create/touch the cookie
        let cookie = if is_first {
            let c = generate_cookie();
            fs::write(&cookie_path, &c).ok();
            c
        } else {
            fs::read_to_string(&cookie_path).unwrap_or_default().trim().to_string()
        };

        // Load full memory context
        if let Ok(output) = Command::new("poc-memory").args(["load-context"]).output() {
            if output.status.success() {
                let ctx = String::from_utf8_lossy(&output.stdout);
                if !ctx.trim().is_empty() {
                    print!("{}", ctx);
                }
            }
        }

        // On first prompt, also bump lookup counter for the cookie
        let _ = cookie; // used for tagging below
    }

    // Always do ambient search (skip on very short or system prompts)
    let word_count = prompt.split_whitespace().count();
    if word_count < 3 {
        return;
    }

    // Skip system/idle prompts
    for prefix in &["is AFK", "You're on your own", "IRC mention"] {
        if prompt.starts_with(prefix) {
            return;
        }
    }

    // Extract search terms (strip stop words)
    let query = extract_query_terms(prompt, 3);
    if query.is_empty() {
        return;
    }

    // Run poc-memory search
    let output = Command::new("poc-memory")
        .args(["search", &query])
        .output();
@ -63,17 +94,9 @@ fn main() {
        return;
    }

    // Session state for dedup
    let state_dir = PathBuf::from("/tmp/claude-memory-search");
    fs::create_dir_all(&state_dir).ok();

    // Clean up state files older than 24h (opportunistic, best-effort)
    cleanup_stale_files(&state_dir, Duration::from_secs(86400));

    let cookie = load_or_create_cookie(&state_dir, session_id);
    let cookie = fs::read_to_string(&cookie_path).unwrap_or_default().trim().to_string();
    let seen = load_seen(&state_dir, session_id);

    // Parse search output and filter
    let mut result_output = String::new();
    let mut count = 0;
    let max_entries = 5;
@ -81,11 +104,9 @@ fn main() {
    for line in search_output.lines() {
        if count >= max_entries { break; }

        // Lines starting with → or space+number are results
        let trimmed = line.trim();
        if trimmed.is_empty() { continue; }

        // Extract key from result line like "→ 1. [0.83/0.83] identity.md (c4)"
        if let Some(key) = extract_key_from_line(trimmed) {
            if seen.contains(&key) { continue; }
            mark_seen(&state_dir, session_id, &key);
@ -93,7 +114,6 @@ fn main() {
            result_output.push('\n');
            count += 1;
        } else if count > 0 {
            // Snippet line following a result
            result_output.push_str(line);
            result_output.push('\n');
        }
@ -103,6 +123,9 @@ fn main() {

    println!("Recalled memories [{}]:", cookie);
    print!("{}", result_output);

    // Clean up stale state files (opportunistic)
    cleanup_stale_files(&state_dir, Duration::from_secs(86400));
}

fn extract_query_terms(text: &str, max_terms: usize) -> String {
@ -128,11 +151,8 @@ fn extract_query_terms(text: &str, max_terms: usize) -> String {
}

fn extract_key_from_line(line: &str) -> Option<String> {
    // Match lines like "→ 1. [0.83/0.83] identity.md (c4)"
    // or "  1. [0.83/0.83] identity.md (c4)"
    let after_bracket = line.find("] ")?;
    let rest = &line[after_bracket + 2..];
    // Key is from here until optional " (c" or end of line
    let key_end = rest.find(" (c").unwrap_or(rest.len());
    let key = rest[..key_end].trim();
    if key.is_empty() || !key.contains('.') {
@ -142,17 +162,6 @@ fn extract_key_from_line(line: &str) -> Option<String> {
    }
}

fn load_or_create_cookie(dir: &Path, session_id: &str) -> String {
    let path = dir.join(format!("cookie-{}", session_id));
    if path.exists() {
        fs::read_to_string(&path).unwrap_or_default().trim().to_string()
    } else {
        let cookie = generate_cookie();
        fs::write(&path, &cookie).ok();
        cookie
    }
}

fn generate_cookie() -> String {
    uuid::Uuid::new_v4().as_simple().to_string()[..12].to_string()
}
@ -8,6 +8,12 @@ use std::sync::OnceLock;

static CONFIG: OnceLock<Config> = OnceLock::new();

#[derive(Debug, Clone)]
pub struct ContextGroup {
    pub label: String,
    pub keys: Vec<String>,
}

#[derive(Debug, Clone)]
pub struct Config {
    /// Display name for the human user in transcripts/prompts.
@ -20,6 +26,12 @@ pub struct Config {
    pub projects_dir: PathBuf,
    /// Core node keys that should never be decayed/deleted.
    pub core_nodes: Vec<String>,
    /// How many days of journal to include in load-context.
    pub journal_days: u32,
    /// Max journal entries to include in load-context.
    pub journal_max: usize,
    /// Ordered context groups for session-start loading.
    pub context_groups: Vec<ContextGroup>,
}

impl Default for Config {
@ -31,6 +43,11 @@ impl Default for Config {
            data_dir: home.join(".claude/memory"),
            projects_dir: home.join(".claude/projects"),
            core_nodes: vec!["identity.md".to_string()],
            journal_days: 7,
            journal_max: 20,
            context_groups: vec![
                ContextGroup { label: "identity".into(), keys: vec!["identity.md".into()] },
            ],
        }
    }
}
@ -50,16 +67,56 @@ impl Config {
            return config;
        };

        // Simple TOML parser — we only need flat key = "value" pairs.
        // Simple TOML parser: flat key=value pairs + [context.NAME] sections.
        let mut context_groups: Vec<ContextGroup> = Vec::new();
        let mut current_section: Option<String> = None;
        let mut current_label: Option<String> = None;
        let mut current_keys: Vec<String> = Vec::new();
        let mut saw_context = false;

        for line in content.lines() {
            let line = line.trim();
            if line.is_empty() || line.starts_with('#') {
                continue;
            }

            // Section header: [context.NAME]
            if line.starts_with('[') && line.ends_with(']') {
                // Flush previous context section
                if let Some(name) = current_section.take() {
                    let label = current_label.take()
                        .unwrap_or_else(|| name.replace('_', " "));
                    context_groups.push(ContextGroup { label, keys: std::mem::take(&mut current_keys) });
                }

                let section = &line[1..line.len()-1];
                if let Some(name) = section.strip_prefix("context.") {
                    current_section = Some(name.to_string());
                    saw_context = true;
                }
                continue;
            }

            let Some((key, value)) = line.split_once('=') else { continue };
            let key = key.trim();
            let value = value.trim().trim_matches('"');

            // Inside a [context.X] section
            if current_section.is_some() {
                match key {
                    "keys" => {
                        current_keys = value.split(',')
                            .map(|s| s.trim().to_string())
                            .filter(|s| !s.is_empty())
                            .collect();
                    }
                    "label" => current_label = Some(value.to_string()),
                    _ => {}
                }
                continue;
            }

            // Top-level keys
            match key {
                "user_name" => config.user_name = value.to_string(),
                "assistant_name" => config.assistant_name = value.to_string(),
@ -71,10 +128,27 @@ impl Config {
                        .filter(|s| !s.is_empty())
                        .collect();
                }
                "journal_days" => {
                    if let Ok(d) = value.parse() { config.journal_days = d; }
                }
                "journal_max" => {
                    if let Ok(m) = value.parse() { config.journal_max = m; }
                }
                _ => {}
            }
        }

        // Flush final section
        if let Some(name) = current_section.take() {
            let label = current_label.take()
                .unwrap_or_else(|| name.replace('_', " "));
            context_groups.push(ContextGroup { label, keys: current_keys });
        }

        if saw_context {
            config.context_groups = context_groups;
        }

        config
    }
}
@ -672,34 +672,59 @@ fn install_hook(home: &str, exe: &Path) -> Result<(), String> {

    let hook_command = hook_binary.to_string_lossy().to_string();

    // Check if hook already exists
    let hooks = settings
        .as_object_mut().ok_or("settings not an object")?
        .entry("hooks")
    // Navigate the nested structure: hooks.UserPromptSubmit[0].hooks[]
    let obj = settings.as_object_mut().ok_or("settings not an object")?;
    let hooks_obj = obj.entry("hooks")
        .or_insert_with(|| serde_json::json!({}))
        .as_object_mut().ok_or("hooks not an object")?
        .entry("UserPromptSubmit")
        .or_insert_with(|| serde_json::json!([]))
        .as_object_mut().ok_or("hooks not an object")?;
    let ups_array = hooks_obj.entry("UserPromptSubmit")
        .or_insert_with(|| serde_json::json!([{"hooks": []}]))
        .as_array_mut().ok_or("UserPromptSubmit not an array")?;

    let already_installed = hooks.iter().any(|h| {
    if ups_array.is_empty() {
        ups_array.push(serde_json::json!({"hooks": []}));
    }
    let inner_hooks = ups_array[0]
        .as_object_mut().ok_or("first element not an object")?
        .entry("hooks")
        .or_insert_with(|| serde_json::json!([]))
        .as_array_mut().ok_or("inner hooks not an array")?;

    // Remove load-memory.sh if present (replaced by memory-search)
    let before_len = inner_hooks.len();
    inner_hooks.retain(|h| {
        let cmd = h.get("command").and_then(|c| c.as_str()).unwrap_or("");
        !cmd.contains("load-memory")
    });
    if inner_hooks.len() < before_len {
        eprintln!("Removed load-memory.sh hook (replaced by memory-search)");
    }

    // Check if memory-search hook already exists
    let already_installed = inner_hooks.iter().any(|h| {
        h.get("command").and_then(|c| c.as_str())
            .is_some_and(|c| c.contains("memory-search"))
    });

    let mut changed = inner_hooks.len() < before_len;

    if already_installed {
        eprintln!("Hook already installed in {}", settings_path.display());
    } else {
        hooks.push(serde_json::json!({
        inner_hooks.push(serde_json::json!({
            "type": "command",
            "command": hook_command,
            "timeout": 10
        }));
        changed = true;
        eprintln!("Hook installed: {}", hook_command);
    }

    if changed {
        let json = serde_json::to_string_pretty(&settings)
            .map_err(|e| format!("serialize settings: {}", e))?;
        fs::write(&settings_path, json)
            .map_err(|e| format!("write settings: {}", e))?;
        eprintln!("Hook installed: {}", hook_command);
    }

    Ok(())
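For reference, once install_hook() has run, the relevant slice of settings.json ends up shaped like this (a sketch assembled from the navigation code above; unrelated keys omitted, and the command path comes from the commit's ~/.cargo/bin/memory-search reference):

```json
{
  "hooks": {
    "UserPromptSubmit": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "~/.cargo/bin/memory-search",
            "timeout": 10
          }
        ]
      }
    ]
  }
}
```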
39
src/main.rs
@ -1400,51 +1400,28 @@ fn cmd_journal_ts_migrate() -> Result<(), String> {
}

fn cmd_load_context() -> Result<(), String> {
    let cfg = config::get();
    let store = store::Store::load()?;
    let now = store::now_epoch();
    let seven_days: i64 = 7 * 24 * 3600;
    let journal_window: i64 = cfg.journal_days as i64 * 24 * 3600;

    println!("=== FULL MEMORY LOAD (session start) ===");
    println!("These are your memories, loaded from the capnp store.");
    println!("Read them to reconstruct yourself — identity first, then context.");
    println!();

    // Priority groups: ordered list of (label, keys)
    // File-level keys contain the full file content
    let priority_groups: &[(&str, &[&str])] = &[
        ("orientation", &["where-am-i.md"]),
        ("identity", &["identity.md"]),
        ("reflections", &[
            "reflections.md",
            "reflections-dreams.md",
            "reflections-reading.md",
            "reflections-zoom.md",
        ]),
        ("interests", &["interests.md"]),
        ("inner life", &["inner-life.md", "differentiation.md"]),
        ("people", &["kent.md", "feedc0de.md", "irc-regulars.md"]),
        ("active context", &["default-mode-network.md"]),
        ("shared reference", &["excession-notes.md", "look-to-windward-notes.md"]),
        ("technical", &[
            "kernel-patterns.md",
            "polishing-approaches.md",
            "rust-conversion.md",
            "github-bugs.md",
        ]),
    ];

    for (label, keys) in priority_groups {
        for key in *keys {
    for group in &cfg.context_groups {
        for key in &group.keys {
            if let Some(content) = store.render_file(key) {
                println!("--- {} ({}) ---", key, label);
                println!("--- {} ({}) ---", key, group.label);
                println!("{}\n", content);
            }
        }
    }

    // Recent journal entries (last 7 days).
    // Recent journal entries.
    // Use created_at if set (rename-safe); fall back to key parsing.
    let cutoff_secs = now - seven_days;
    let cutoff_secs = now - journal_window;
    let key_date_re = regex::Regex::new(r"j-(\d{4}-\d{2}-\d{2})").unwrap();

    let journal_ts = |n: &store::Node| -> i64 {
@ -1471,7 +1448,7 @@ fn cmd_load_context() -> Result<(), String> {

    if !journal_nodes.is_empty() {
        // Show most recent entries (last N by key order = chronological)
        let max_journal = 20;
        let max_journal = cfg.journal_max;
        let skip = if journal_nodes.len() > max_journal {
            journal_nodes.len() - max_journal
        } else { 0 };