poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
#![allow(dead_code)]
|
|
|
|
|
// poc-memory: graph-structured memory with append-only Cap'n Proto storage
|
|
|
|
|
//
|
|
|
|
|
// Architecture:
|
|
|
|
|
// nodes.capnp - append-only content node log
|
|
|
|
|
// relations.capnp - append-only relation log
|
|
|
|
|
// state.bin - derived KV cache (rebuilt from logs when stale)
|
|
|
|
|
//
|
|
|
|
|
// Graph algorithms: clustering coefficient, community detection (label
|
|
|
|
|
// propagation), schema fit scoring, small-world metrics, consolidation
|
|
|
|
|
// priority. Text similarity via BM25 with Porter stemming.
|
|
|
|
|
//
|
|
|
|
|
// Neuroscience-inspired: spaced repetition replay, emotional gating,
|
|
|
|
|
// interference detection, schema assimilation, reconsolidation.
|
|
|
|
|
|
|
|
|
|
mod capnp_store;
|
2026-02-28 23:58:05 -05:00
|
|
|
mod digest;
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
mod graph;
|
|
|
|
|
mod search;
|
|
|
|
|
mod similarity;
|
|
|
|
|
mod migrate;
|
|
|
|
|
mod neuro;
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
mod query;
|
2026-03-03 01:33:31 -05:00
|
|
|
mod spectral;
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
|
|
|
|
|
pub mod memory_capnp {
|
|
|
|
|
include!(concat!(env!("OUT_DIR"), "/schema/memory_capnp.rs"));
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
use std::env;
|
|
|
|
|
use std::process;
|
|
|
|
|
|
2026-02-28 23:44:44 -05:00
|
|
|
/// Find the most recently modified .jsonl transcript in the Claude projects dir.
|
|
|
|
|
fn find_current_transcript() -> Option<String> {
|
|
|
|
|
let home = env::var("HOME").ok()?;
|
|
|
|
|
let projects = std::path::Path::new(&home).join(".claude/projects");
|
|
|
|
|
if !projects.exists() { return None; }
|
|
|
|
|
|
|
|
|
|
// Search all project dirs for the most recent .jsonl
|
|
|
|
|
let mut newest: Option<(std::time::SystemTime, std::path::PathBuf)> = None;
|
|
|
|
|
if let Ok(dirs) = std::fs::read_dir(&projects) {
|
|
|
|
|
for dir_entry in dirs.filter_map(|e| e.ok()) {
|
|
|
|
|
if !dir_entry.path().is_dir() { continue; }
|
|
|
|
|
if let Ok(files) = std::fs::read_dir(dir_entry.path()) {
|
|
|
|
|
for f in files.filter_map(|e| e.ok()) {
|
|
|
|
|
let p = f.path();
|
|
|
|
|
if p.extension().map(|x| x == "jsonl").unwrap_or(false) {
|
|
|
|
|
if let Ok(meta) = p.metadata() {
|
|
|
|
|
if let Ok(mtime) = meta.modified() {
|
2026-02-28 23:47:11 -05:00
|
|
|
if newest.as_ref().is_none_or(|(t, _)| mtime > *t) {
|
2026-02-28 23:44:44 -05:00
|
|
|
newest = Some((mtime, p));
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
newest.map(|(_, p)| p.to_string_lossy().to_string())
|
|
|
|
|
}
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
fn main() {
|
|
|
|
|
let args: Vec<String> = env::args().collect();
|
|
|
|
|
if args.len() < 2 {
|
|
|
|
|
usage();
|
|
|
|
|
process::exit(1);
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let result = match args[1].as_str() {
|
|
|
|
|
"search" => cmd_search(&args[2..]),
|
|
|
|
|
"init" => cmd_init(),
|
|
|
|
|
"migrate" => cmd_migrate(),
|
|
|
|
|
"health" => cmd_health(),
|
|
|
|
|
"status" => cmd_status(),
|
|
|
|
|
"graph" => cmd_graph(),
|
|
|
|
|
"used" => cmd_used(&args[2..]),
|
|
|
|
|
"wrong" => cmd_wrong(&args[2..]),
|
|
|
|
|
"gap" => cmd_gap(&args[2..]),
|
|
|
|
|
"categorize" => cmd_categorize(&args[2..]),
|
2026-03-01 08:18:07 -05:00
|
|
|
"fix-categories" => cmd_fix_categories(),
|
|
|
|
|
"cap-degree" => cmd_cap_degree(&args[2..]),
|
|
|
|
|
"link-orphans" => cmd_link_orphans(&args[2..]),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
"decay" => cmd_decay(),
|
|
|
|
|
"consolidate-batch" => cmd_consolidate_batch(&args[2..]),
|
|
|
|
|
"log" => cmd_log(),
|
|
|
|
|
"params" => cmd_params(),
|
|
|
|
|
"link" => cmd_link(&args[2..]),
|
|
|
|
|
"replay-queue" => cmd_replay_queue(&args[2..]),
|
|
|
|
|
"interference" => cmd_interference(&args[2..]),
|
|
|
|
|
"link-add" => cmd_link_add(&args[2..]),
|
|
|
|
|
"link-impact" => cmd_link_impact(&args[2..]),
|
|
|
|
|
"consolidate-session" => cmd_consolidate_session(),
|
2026-03-01 07:14:03 -05:00
|
|
|
"consolidate-full" => cmd_consolidate_full(),
|
2026-03-01 07:35:29 -05:00
|
|
|
"triangle-close" => cmd_triangle_close(&args[2..]),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
"daily-check" => cmd_daily_check(),
|
|
|
|
|
"apply-agent" => cmd_apply_agent(&args[2..]),
|
|
|
|
|
"digest" => cmd_digest(&args[2..]),
|
2026-03-01 00:10:03 -05:00
|
|
|
"digest-links" => cmd_digest_links(&args[2..]),
|
|
|
|
|
"journal-enrich" => cmd_journal_enrich(&args[2..]),
|
2026-03-01 01:47:31 -05:00
|
|
|
"experience-mine" => cmd_experience_mine(&args[2..]),
|
2026-03-01 00:10:03 -05:00
|
|
|
"apply-consolidation" => cmd_apply_consolidation(&args[2..]),
|
2026-03-01 00:33:46 -05:00
|
|
|
"differentiate" => cmd_differentiate(&args[2..]),
|
2026-03-01 00:48:44 -05:00
|
|
|
"link-audit" => cmd_link_audit(&args[2..]),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
"trace" => cmd_trace(&args[2..]),
|
2026-03-03 01:33:31 -05:00
|
|
|
"spectral" => cmd_spectral(&args[2..]),
|
|
|
|
|
"spectral-save" => cmd_spectral_save(&args[2..]),
|
|
|
|
|
"spectral-neighbors" => cmd_spectral_neighbors(&args[2..]),
|
|
|
|
|
"spectral-positions" => cmd_spectral_positions(&args[2..]),
|
|
|
|
|
"spectral-suggest" => cmd_spectral_suggest(&args[2..]),
|
2026-02-28 22:30:03 -05:00
|
|
|
"list-keys" => cmd_list_keys(),
|
|
|
|
|
"list-edges" => cmd_list_edges(),
|
|
|
|
|
"dump-json" => cmd_dump_json(),
|
2026-02-28 22:40:17 -05:00
|
|
|
"node-delete" => cmd_node_delete(&args[2..]),
|
add load-context and render commands
load-context replaces the shell hook's file-by-file cat approach.
Queries the capnp store directly for all session-start context:
orientation, identity, reflections, interests, inner life, people,
active context, shared reference, technical, and recent journal.
Sections are gathered per-file and output in priority order.
Journal entries filtered to last 7 days by key-embedded date,
capped at 20 most recent.
render outputs a single node's content to stdout.
The load-memory.sh hook now delegates entirely to
`poc-memory load-context` — capnp store is the single source
of truth for session startup context.
2026-02-28 22:53:39 -05:00
|
|
|
"load-context" => cmd_load_context(),
|
|
|
|
|
"render" => cmd_render(&args[2..]),
|
2026-02-28 23:00:52 -05:00
|
|
|
"write" => cmd_write(&args[2..]),
|
|
|
|
|
"import" => cmd_import(&args[2..]),
|
|
|
|
|
"export" => cmd_export(&args[2..]),
|
2026-02-28 23:13:17 -05:00
|
|
|
"journal-write" => cmd_journal_write(&args[2..]),
|
|
|
|
|
"journal-tail" => cmd_journal_tail(&args[2..]),
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
"query" => cmd_query(&args[2..]),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
_ => {
|
|
|
|
|
eprintln!("Unknown command: {}", args[1]);
|
|
|
|
|
usage();
|
|
|
|
|
process::exit(1);
|
|
|
|
|
}
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
if let Err(e) = result {
|
|
|
|
|
eprintln!("Error: {}", e);
|
|
|
|
|
process::exit(1);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn usage() {
|
|
|
|
|
eprintln!("poc-memory v0.4.0 — graph-structured memory store
|
|
|
|
|
|
|
|
|
|
Commands:
|
|
|
|
|
search QUERY [QUERY...] Search memory (AND logic across terms)
|
|
|
|
|
init Scan markdown files, index all memory units
|
|
|
|
|
migrate Migrate from old weights.json system
|
|
|
|
|
health Report graph metrics (CC, communities, small-world)
|
|
|
|
|
status Summary of memory state
|
|
|
|
|
graph Show graph structure overview
|
|
|
|
|
used KEY Mark a memory as useful (boosts weight)
|
|
|
|
|
wrong KEY [CONTEXT] Mark a memory as wrong/irrelevant
|
|
|
|
|
gap DESCRIPTION Record a gap in memory coverage
|
|
|
|
|
categorize KEY CATEGORY Reassign category (core/tech/gen/obs/task)
|
|
|
|
|
decay Apply daily weight decay
|
|
|
|
|
consolidate-batch [--count N] [--auto]
|
|
|
|
|
Run agent consolidation on priority nodes
|
|
|
|
|
log Show recent retrieval log
|
|
|
|
|
params Show current parameters
|
|
|
|
|
link N Interactive graph walk from search result N
|
|
|
|
|
replay-queue [--count N] Show spaced repetition replay queue
|
|
|
|
|
interference [--threshold F]
|
|
|
|
|
Detect potentially confusable memory pairs
|
|
|
|
|
link-add SOURCE TARGET [REASON]
|
|
|
|
|
Add a link between two nodes
|
|
|
|
|
link-impact SOURCE TARGET Simulate adding an edge, report topology impact
|
|
|
|
|
consolidate-session Analyze metrics, plan agent allocation
|
2026-03-01 07:14:03 -05:00
|
|
|
consolidate-full Autonomous: plan → agents → apply → digests → links
|
2026-03-01 07:35:29 -05:00
|
|
|
triangle-close [DEG] [SIM] [MAX]
|
|
|
|
|
Close triangles: link similar neighbors of hubs
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
daily-check Brief metrics check (for cron/notifications)
|
|
|
|
|
apply-agent [--all] Import pending agent results into the graph
|
|
|
|
|
digest daily [DATE] Generate daily episodic digest (default: today)
|
|
|
|
|
digest weekly [DATE] Generate weekly digest (any date in target week)
|
2026-02-28 23:58:05 -05:00
|
|
|
digest monthly [YYYY-MM] Generate monthly digest (default: current month)
|
2026-03-01 07:14:03 -05:00
|
|
|
digest auto Generate all missing digests (daily→weekly→monthly)
|
2026-03-01 00:10:03 -05:00
|
|
|
digest-links [--apply] Parse and apply links from digest files
|
|
|
|
|
journal-enrich JSONL TEXT [LINE]
|
|
|
|
|
Enrich journal entry with conversation links
|
2026-03-01 01:47:31 -05:00
|
|
|
experience-mine [JSONL] Mine conversation for experiential moments to journal
|
2026-03-01 00:10:03 -05:00
|
|
|
apply-consolidation [--apply] [--report FILE]
|
|
|
|
|
Extract and apply actions from consolidation reports
|
2026-03-01 00:33:46 -05:00
|
|
|
differentiate [KEY] [--apply]
|
|
|
|
|
Redistribute hub links to section-level children
|
2026-03-01 00:48:44 -05:00
|
|
|
link-audit [--apply] Walk every link, send to Sonnet for quality review
|
2026-02-28 22:30:03 -05:00
|
|
|
trace KEY Walk temporal links: semantic ↔ episodic ↔ conversation
|
2026-03-03 01:33:31 -05:00
|
|
|
spectral [K] Spectral decomposition of the memory graph (default K=30)
|
|
|
|
|
spectral-save [K] Compute and save spectral embedding (default K=20)
|
|
|
|
|
spectral-neighbors KEY [N] Find N spectrally nearest nodes (default N=15)
|
|
|
|
|
spectral-positions [N] Show N nodes ranked by outlier/bridge score (default 30)
|
|
|
|
|
spectral-suggest [N] Find N spectrally close but unlinked pairs (default 20)
|
2026-02-28 22:30:03 -05:00
|
|
|
list-keys List all node keys (one per line)
|
|
|
|
|
list-edges List all edges (tsv: source target strength type)
|
2026-02-28 22:40:17 -05:00
|
|
|
dump-json Dump entire store as JSON
|
add load-context and render commands
load-context replaces the shell hook's file-by-file cat approach.
Queries the capnp store directly for all session-start context:
orientation, identity, reflections, interests, inner life, people,
active context, shared reference, technical, and recent journal.
Sections are gathered per-file and output in priority order.
Journal entries filtered to last 7 days by key-embedded date,
capped at 20 most recent.
render outputs a single node's content to stdout.
The load-memory.sh hook now delegates entirely to
`poc-memory load-context` — capnp store is the single source
of truth for session startup context.
2026-02-28 22:53:39 -05:00
|
|
|
node-delete KEY Soft-delete a node (appends deleted version to log)
|
|
|
|
|
load-context Output session-start context from the store
|
2026-02-28 23:00:52 -05:00
|
|
|
render KEY Output a node's content to stdout
|
|
|
|
|
write KEY Upsert node content from stdin
|
|
|
|
|
import FILE [FILE...] Import markdown file(s) into the store
|
2026-02-28 23:13:17 -05:00
|
|
|
export [FILE|--all] Export store nodes to markdown file(s)
|
|
|
|
|
journal-write TEXT Write a journal entry to the store
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
journal-tail [N] [--full] Show last N journal entries (default 20, --full for content)
|
2026-03-03 11:05:28 -05:00
|
|
|
query 'EXPR | stages' Query the memory graph
|
|
|
|
|
Stages: sort F [asc], limit N, select F,F, count
|
|
|
|
|
Ex: \"degree > 15 | sort degree | limit 10\"");
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_search(args: &[String]) -> Result<(), String> {
|
2026-03-03 01:33:31 -05:00
|
|
|
use capnp_store::StoreView;
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory search QUERY [QUERY...]".into());
|
|
|
|
|
}
|
|
|
|
|
let query = args.join(" ");
|
2026-03-03 01:33:31 -05:00
|
|
|
|
|
|
|
|
let view = capnp_store::AnyView::load()?;
|
|
|
|
|
let results = search::search(&query, &view);
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
|
|
|
|
|
if results.is_empty() {
|
|
|
|
|
eprintln!("No results for '{}'", query);
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-03 01:33:31 -05:00
|
|
|
// Log retrieval to a small append-only file (avoid 6MB state.bin rewrite)
|
|
|
|
|
capnp_store::Store::log_retrieval_static(&query,
|
|
|
|
|
&results.iter().map(|r| r.key.clone()).collect::<Vec<_>>());
|
|
|
|
|
|
|
|
|
|
// Show text results
|
|
|
|
|
let text_keys: std::collections::HashSet<String> = results.iter()
|
|
|
|
|
.take(15).map(|r| r.key.clone()).collect();
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
|
|
|
|
|
for (i, r) in results.iter().enumerate().take(15) {
|
|
|
|
|
let marker = if r.is_direct { "→" } else { " " };
|
2026-03-03 01:33:31 -05:00
|
|
|
let weight = view.node_weight(&r.key);
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
print!("{}{:2}. [{:.2}/{:.2}] {}", marker, i + 1, r.activation, weight, r.key);
|
|
|
|
|
println!();
|
|
|
|
|
if let Some(ref snippet) = r.snippet {
|
|
|
|
|
println!(" {}", snippet);
|
|
|
|
|
}
|
|
|
|
|
}
|
2026-03-03 01:33:31 -05:00
|
|
|
|
|
|
|
|
// Spectral expansion: find neighbors of top text hits
|
|
|
|
|
if let Ok(emb) = spectral::load_embedding() {
|
|
|
|
|
let seeds: Vec<&str> = results.iter()
|
|
|
|
|
.take(5)
|
|
|
|
|
.map(|r| r.key.as_str())
|
|
|
|
|
.filter(|k| emb.coords.contains_key(*k))
|
|
|
|
|
.collect();
|
|
|
|
|
|
|
|
|
|
if !seeds.is_empty() {
|
|
|
|
|
let spectral_hits = spectral::nearest_to_seeds(&emb, &seeds, 10);
|
|
|
|
|
// Filter to nodes not already in text results
|
|
|
|
|
let new_hits: Vec<_> = spectral_hits.into_iter()
|
|
|
|
|
.filter(|(k, _)| !text_keys.contains(k))
|
|
|
|
|
.take(5)
|
|
|
|
|
.collect();
|
|
|
|
|
|
|
|
|
|
if !new_hits.is_empty() {
|
|
|
|
|
println!("\nSpectral neighbors (structural, not keyword):");
|
|
|
|
|
for (k, _dist) in &new_hits {
|
|
|
|
|
let weight = view.node_weight(k);
|
|
|
|
|
print!(" ~ [{:.2}] {}", weight, k);
|
|
|
|
|
println!();
|
|
|
|
|
// Show first line of content as snippet
|
|
|
|
|
if let Some(content) = view.node_content(k) {
|
|
|
|
|
let snippet: String = content.lines()
|
|
|
|
|
.find(|l| !l.trim().is_empty() && !l.starts_with('#'))
|
|
|
|
|
.unwrap_or("")
|
|
|
|
|
.chars().take(100).collect();
|
|
|
|
|
if !snippet.is_empty() {
|
|
|
|
|
println!(" {}", snippet);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_init() -> Result<(), String> {
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let count = store.init_from_markdown()?;
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Indexed {} memory units", count);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_migrate() -> Result<(), String> {
|
|
|
|
|
migrate::migrate()
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_health() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
let health = graph::health_report(&g, &store);
|
|
|
|
|
println!("{}", health);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_status() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let node_count = store.nodes.len();
|
|
|
|
|
let rel_count = store.relations.len();
|
|
|
|
|
let categories = store.category_counts();
|
|
|
|
|
|
|
|
|
|
println!("Nodes: {} Relations: {}", node_count, rel_count);
|
|
|
|
|
println!("Categories: core={} tech={} gen={} obs={} task={}",
|
|
|
|
|
categories.get("core").unwrap_or(&0),
|
|
|
|
|
categories.get("tech").unwrap_or(&0),
|
|
|
|
|
categories.get("gen").unwrap_or(&0),
|
|
|
|
|
categories.get("obs").unwrap_or(&0),
|
|
|
|
|
categories.get("task").unwrap_or(&0),
|
|
|
|
|
);
|
|
|
|
|
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
println!("Graph edges: {} Communities: {}",
|
|
|
|
|
g.edge_count(), g.community_count());
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_graph() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
|
|
|
|
|
// Show top-10 highest degree nodes
|
|
|
|
|
let mut degrees: Vec<_> = g.nodes().iter()
|
|
|
|
|
.map(|k| (k.clone(), g.degree(k)))
|
|
|
|
|
.collect();
|
|
|
|
|
degrees.sort_by(|a, b| b.1.cmp(&a.1));
|
|
|
|
|
|
|
|
|
|
println!("Top nodes by degree:");
|
|
|
|
|
for (key, deg) in degrees.iter().take(10) {
|
|
|
|
|
let cc = g.clustering_coefficient(key);
|
|
|
|
|
println!(" {:40} deg={:3} cc={:.3}", key, deg, cc);
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_used(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory used KEY".into());
|
|
|
|
|
}
|
|
|
|
|
let key = args.join(" ");
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let resolved = store.resolve_key(&key)?;
|
|
|
|
|
store.mark_used(&resolved);
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Marked '{}' as used", resolved);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_wrong(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory wrong KEY [CONTEXT]".into());
|
|
|
|
|
}
|
|
|
|
|
let key = &args[0];
|
|
|
|
|
let ctx = if args.len() > 1 { Some(args[1..].join(" ")) } else { None };
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let resolved = store.resolve_key(key)?;
|
|
|
|
|
store.mark_wrong(&resolved, ctx.as_deref());
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Marked '{}' as wrong", resolved);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_gap(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory gap DESCRIPTION".into());
|
|
|
|
|
}
|
|
|
|
|
let desc = args.join(" ");
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
store.record_gap(&desc);
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Recorded gap: {}", desc);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_categorize(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.len() < 2 {
|
|
|
|
|
return Err("Usage: poc-memory categorize KEY CATEGORY".into());
|
|
|
|
|
}
|
|
|
|
|
let key = &args[0];
|
|
|
|
|
let cat = &args[1];
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let resolved = store.resolve_key(key)?;
|
|
|
|
|
store.categorize(&resolved, cat)?;
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Set '{}' category to {}", resolved, cat);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 08:18:07 -05:00
|
|
|
fn cmd_fix_categories() -> Result<(), String> {
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let before = format!("{:?}", store.category_counts());
|
|
|
|
|
let (changed, kept) = store.fix_categories()?;
|
|
|
|
|
store.save()?;
|
|
|
|
|
let after = format!("{:?}", store.category_counts());
|
|
|
|
|
println!("Category fix: {} changed, {} kept", changed, kept);
|
|
|
|
|
println!("\nBefore: {}", before);
|
|
|
|
|
println!("After: {}", after);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_link_orphans(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let min_deg: usize = args.first().and_then(|s| s.parse().ok()).unwrap_or(2);
|
|
|
|
|
let links_per: usize = args.get(1).and_then(|s| s.parse().ok()).unwrap_or(3);
|
|
|
|
|
let sim_thresh: f32 = args.get(2).and_then(|s| s.parse().ok()).unwrap_or(0.15);
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let (orphans, links) = neuro::link_orphans(&mut store, min_deg, links_per, sim_thresh);
|
|
|
|
|
println!("Linked {} orphans, added {} connections (min_degree={}, links_per={}, sim>{})",
|
|
|
|
|
orphans, links, min_deg, links_per, sim_thresh);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_cap_degree(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let max_deg: usize = args.first().and_then(|s| s.parse().ok()).unwrap_or(50);
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let (hubs, pruned) = store.cap_degree(max_deg)?;
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Capped {} hubs, pruned {} weak Auto edges (max_degree={})", hubs, pruned, max_deg);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
fn cmd_decay() -> Result<(), String> {
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let (decayed, pruned) = store.decay();
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Decayed {} nodes, pruned {} below threshold", decayed, pruned);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_consolidate_batch(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let mut count = 5usize;
|
|
|
|
|
let mut auto = false;
|
|
|
|
|
let mut agent: Option<String> = None;
|
|
|
|
|
let mut i = 0;
|
|
|
|
|
while i < args.len() {
|
|
|
|
|
match args[i].as_str() {
|
|
|
|
|
"--count" if i + 1 < args.len() => {
|
|
|
|
|
count = args[i + 1].parse().map_err(|_| "invalid count")?;
|
|
|
|
|
i += 2;
|
|
|
|
|
}
|
|
|
|
|
"--auto" => { auto = true; i += 1; }
|
|
|
|
|
"--agent" if i + 1 < args.len() => {
|
|
|
|
|
agent = Some(args[i + 1].clone());
|
|
|
|
|
i += 2;
|
|
|
|
|
}
|
|
|
|
|
_ => { i += 1; }
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
|
|
|
|
|
if let Some(agent_name) = agent {
|
|
|
|
|
// Generate a specific agent prompt
|
|
|
|
|
let prompt = neuro::agent_prompt(&store, &agent_name, count)?;
|
|
|
|
|
println!("{}", prompt);
|
|
|
|
|
Ok(())
|
|
|
|
|
} else {
|
|
|
|
|
neuro::consolidation_batch(&store, count, auto)
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_log() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
for event in store.retrieval_log.iter().rev().take(20) {
|
|
|
|
|
println!("[{}] q=\"{}\" → {} results",
|
|
|
|
|
event.timestamp, event.query, event.results.len());
|
|
|
|
|
for r in &event.results {
|
|
|
|
|
println!(" {}", r);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_params() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
println!("decay_factor: {}", store.params.decay_factor);
|
|
|
|
|
println!("use_boost: {}", store.params.use_boost);
|
|
|
|
|
println!("prune_threshold: {}", store.params.prune_threshold);
|
|
|
|
|
println!("edge_decay: {}", store.params.edge_decay);
|
|
|
|
|
println!("max_hops: {}", store.params.max_hops);
|
|
|
|
|
println!("min_activation: {}", store.params.min_activation);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_link(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory link KEY".into());
|
|
|
|
|
}
|
|
|
|
|
let key = args.join(" ");
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let resolved = store.resolve_key(&key)?;
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
|
|
|
|
|
println!("Neighbors of '{}':", resolved);
|
|
|
|
|
let neighbors = g.neighbors(&resolved);
|
|
|
|
|
for (i, (n, strength)) in neighbors.iter().enumerate() {
|
|
|
|
|
let cc = g.clustering_coefficient(n);
|
|
|
|
|
println!(" {:2}. [{:.2}] {} (cc={:.3})", i + 1, strength, n, cc);
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_replay_queue(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let mut count = 10usize;
|
|
|
|
|
let mut i = 0;
|
|
|
|
|
while i < args.len() {
|
|
|
|
|
match args[i].as_str() {
|
|
|
|
|
"--count" if i + 1 < args.len() => {
|
|
|
|
|
count = args[i + 1].parse().map_err(|_| "invalid count")?;
|
|
|
|
|
i += 2;
|
|
|
|
|
}
|
|
|
|
|
_ => { i += 1; }
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let queue = neuro::replay_queue(&store, count);
|
|
|
|
|
println!("Replay queue ({} items):", queue.len());
|
|
|
|
|
for (i, item) in queue.iter().enumerate() {
|
2026-03-03 01:33:31 -05:00
|
|
|
println!(" {:2}. [{:.3}] {:>10} {} (interval={}d, emotion={:.1}, spectral={:.1})",
|
|
|
|
|
i + 1, item.priority, item.classification, item.key,
|
|
|
|
|
item.interval_days, item.emotion, item.outlier_score);
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_consolidate_session() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let plan = neuro::consolidation_plan(&store);
|
|
|
|
|
println!("{}", neuro::format_plan(&plan));
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 07:14:03 -05:00
|
|
|
fn cmd_consolidate_full() -> Result<(), String> {
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
digest::consolidate_full(&mut store)
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 07:35:29 -05:00
|
|
|
fn cmd_triangle_close(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let min_degree: usize = args.first()
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(5);
|
|
|
|
|
let sim_threshold: f32 = args.get(1)
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(0.3);
|
|
|
|
|
let max_per_hub: usize = args.get(2)
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(10);
|
|
|
|
|
|
|
|
|
|
println!("Triangle closure: min_degree={}, sim_threshold={}, max_per_hub={}",
|
|
|
|
|
min_degree, sim_threshold, max_per_hub);
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let (hubs, added) = neuro::triangle_close(&mut store, min_degree, sim_threshold, max_per_hub);
|
|
|
|
|
println!("\nProcessed {} hubs, added {} lateral links", hubs, added);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
fn cmd_daily_check() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let report = neuro::daily_check(&store);
|
|
|
|
|
print!("{}", report);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_link_add(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.len() < 2 {
|
|
|
|
|
return Err("Usage: poc-memory link-add SOURCE TARGET [REASON]".into());
|
|
|
|
|
}
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let source = store.resolve_key(&args[0])?;
|
|
|
|
|
let target = store.resolve_key(&args[1])?;
|
|
|
|
|
let reason = if args.len() > 2 { args[2..].join(" ") } else { String::new() };
|
|
|
|
|
|
2026-03-01 00:33:46 -05:00
|
|
|
// Refine target to best-matching section
|
|
|
|
|
let source_content = store.nodes.get(&source)
|
|
|
|
|
.map(|n| n.content.as_str()).unwrap_or("");
|
|
|
|
|
let target = neuro::refine_target(&store, source_content, &target);
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
// Find UUIDs
|
|
|
|
|
let source_uuid = store.nodes.get(&source)
|
|
|
|
|
.map(|n| n.uuid)
|
|
|
|
|
.ok_or_else(|| format!("source not found: {}", source))?;
|
|
|
|
|
let target_uuid = store.nodes.get(&target)
|
|
|
|
|
.map(|n| n.uuid)
|
|
|
|
|
.ok_or_else(|| format!("target not found: {}", target))?;
|
|
|
|
|
|
|
|
|
|
// Check if link already exists
|
|
|
|
|
let exists = store.relations.iter().any(|r|
|
|
|
|
|
r.source_key == source && r.target_key == target && !r.deleted
|
|
|
|
|
);
|
|
|
|
|
if exists {
|
|
|
|
|
println!("Link already exists: {} → {}", source, target);
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let rel = capnp_store::Store::new_relation(
|
|
|
|
|
source_uuid, target_uuid,
|
|
|
|
|
capnp_store::RelationType::Auto,
|
|
|
|
|
0.5,
|
|
|
|
|
&source, &target,
|
|
|
|
|
);
|
|
|
|
|
store.add_relation(rel)?;
|
|
|
|
|
if !reason.is_empty() {
|
|
|
|
|
println!("+ {} → {} ({})", source, target, reason);
|
|
|
|
|
} else {
|
|
|
|
|
println!("+ {} → {}", source, target);
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_link_impact(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.len() < 2 {
|
|
|
|
|
return Err("Usage: poc-memory link-impact SOURCE TARGET".into());
|
|
|
|
|
}
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let source = store.resolve_key(&args[0])?;
|
|
|
|
|
let target = store.resolve_key(&args[1])?;
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
|
|
|
|
|
let impact = g.link_impact(&source, &target);
|
|
|
|
|
|
|
|
|
|
println!("Link impact: {} → {}", source, target);
|
|
|
|
|
println!(" Source degree: {} Target degree: {}", impact.source_deg, impact.target_deg);
|
|
|
|
|
println!(" Hub link: {} Same community: {}", impact.is_hub_link, impact.same_community);
|
|
|
|
|
println!(" ΔCC source: {:+.4} ΔCC target: {:+.4}", impact.delta_cc_source, impact.delta_cc_target);
|
|
|
|
|
println!(" ΔGini: {:+.6}", impact.delta_gini);
|
|
|
|
|
println!(" Assessment: {}", impact.assessment);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_apply_agent(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let home = env::var("HOME").unwrap_or_default();
|
|
|
|
|
let results_dir = std::path::PathBuf::from(&home)
|
|
|
|
|
.join(".claude/memory/agent-results");
|
|
|
|
|
|
|
|
|
|
if !results_dir.exists() {
|
|
|
|
|
println!("No agent results directory");
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let mut applied = 0;
|
|
|
|
|
let mut errors = 0;
|
|
|
|
|
|
|
|
|
|
let process_all = args.iter().any(|a| a == "--all");
|
|
|
|
|
|
|
|
|
|
// Find .json result files
|
|
|
|
|
let mut files: Vec<_> = std::fs::read_dir(&results_dir)
|
|
|
|
|
.map_err(|e| format!("read results dir: {}", e))?
|
|
|
|
|
.filter_map(|e| e.ok())
|
|
|
|
|
.filter(|e| e.path().extension().map(|x| x == "json").unwrap_or(false))
|
|
|
|
|
.collect();
|
|
|
|
|
files.sort_by_key(|e| e.path());
|
|
|
|
|
|
|
|
|
|
for entry in &files {
|
|
|
|
|
let path = entry.path();
|
|
|
|
|
let content = match std::fs::read_to_string(&path) {
|
|
|
|
|
Ok(c) => c,
|
|
|
|
|
Err(e) => {
|
|
|
|
|
eprintln!(" Skip {}: {}", path.display(), e);
|
|
|
|
|
errors += 1;
|
|
|
|
|
continue;
|
|
|
|
|
}
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
let data: serde_json::Value = match serde_json::from_str(&content) {
|
|
|
|
|
Ok(d) => d,
|
|
|
|
|
Err(e) => {
|
|
|
|
|
eprintln!(" Skip {}: parse error: {}", path.display(), e);
|
|
|
|
|
errors += 1;
|
|
|
|
|
continue;
|
|
|
|
|
}
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
// Check for agent_result with links
|
|
|
|
|
let agent_result = data.get("agent_result").or(Some(&data));
|
|
|
|
|
let links = match agent_result.and_then(|r| r.get("links")).and_then(|l| l.as_array()) {
|
|
|
|
|
Some(l) => l,
|
|
|
|
|
None => continue,
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
let entry_text = data.get("entry_text")
|
|
|
|
|
.and_then(|v| v.as_str())
|
|
|
|
|
.unwrap_or("");
|
|
|
|
|
let source_start = agent_result
|
|
|
|
|
.and_then(|r| r.get("source_start"))
|
|
|
|
|
.and_then(|v| v.as_u64());
|
|
|
|
|
let source_end = agent_result
|
|
|
|
|
.and_then(|r| r.get("source_end"))
|
|
|
|
|
.and_then(|v| v.as_u64());
|
|
|
|
|
|
|
|
|
|
println!("Processing {}:", path.file_name().unwrap().to_string_lossy());
|
|
|
|
|
if let (Some(start), Some(end)) = (source_start, source_end) {
|
|
|
|
|
println!(" Source: L{}-L{}", start, end);
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
for link in links {
|
|
|
|
|
let target = match link.get("target").and_then(|v| v.as_str()) {
|
|
|
|
|
Some(t) => t,
|
|
|
|
|
None => continue,
|
|
|
|
|
};
|
|
|
|
|
let reason = link.get("reason").and_then(|v| v.as_str()).unwrap_or("");
|
|
|
|
|
|
|
|
|
|
// Skip NOTE: targets (new topics, not existing nodes)
|
2026-02-28 23:47:11 -05:00
|
|
|
if let Some(note) = target.strip_prefix("NOTE:") {
|
|
|
|
|
println!(" NOTE: {} — {}", note, reason);
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
continue;
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
// Try to resolve the target key and link from journal entry
|
|
|
|
|
let resolved = match store.resolve_key(target) {
|
|
|
|
|
Ok(r) => r,
|
|
|
|
|
Err(_) => {
|
|
|
|
|
println!(" SKIP {} (not found in graph)", target);
|
|
|
|
|
continue;
|
|
|
|
|
}
|
|
|
|
|
};
|
|
|
|
|
|
2026-02-28 23:44:44 -05:00
|
|
|
let source_key = match store.find_journal_node(entry_text) {
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
Some(k) => k,
|
|
|
|
|
None => {
|
|
|
|
|
println!(" SKIP {} (no matching journal node)", target);
|
|
|
|
|
continue;
|
|
|
|
|
}
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
// Get UUIDs for both nodes
|
|
|
|
|
let source_uuid = match store.nodes.get(&source_key) {
|
|
|
|
|
Some(n) => n.uuid,
|
|
|
|
|
None => continue,
|
|
|
|
|
};
|
|
|
|
|
let target_uuid = match store.nodes.get(&resolved) {
|
|
|
|
|
Some(n) => n.uuid,
|
|
|
|
|
None => continue,
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
let rel = capnp_store::Store::new_relation(
|
|
|
|
|
source_uuid, target_uuid,
|
|
|
|
|
capnp_store::RelationType::Link,
|
|
|
|
|
0.5,
|
|
|
|
|
&source_key, &resolved,
|
|
|
|
|
);
|
|
|
|
|
if let Err(e) = store.add_relation(rel) {
|
|
|
|
|
eprintln!(" Error adding relation: {}", e);
|
|
|
|
|
errors += 1;
|
|
|
|
|
} else {
|
|
|
|
|
println!(" LINK {} → {} ({})", source_key, resolved, reason);
|
|
|
|
|
applied += 1;
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
// Move processed file to avoid re-processing
|
|
|
|
|
if !process_all {
|
|
|
|
|
let done_dir = results_dir.join("done");
|
|
|
|
|
std::fs::create_dir_all(&done_dir).ok();
|
|
|
|
|
let dest = done_dir.join(path.file_name().unwrap());
|
|
|
|
|
std::fs::rename(&path, &dest).ok();
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if applied > 0 {
|
|
|
|
|
store.save()?;
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
println!("\nApplied {} links ({} errors, {} files processed)",
|
|
|
|
|
applied, errors, files.len());
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
fn cmd_digest(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
2026-03-01 07:14:03 -05:00
|
|
|
return Err("Usage: poc-memory digest daily|weekly|monthly|auto [DATE]".into());
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
}
|
|
|
|
|
|
2026-02-28 23:58:05 -05:00
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let date_arg = args.get(1).map(|s| s.as_str()).unwrap_or("");
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
|
|
|
|
|
match args[0].as_str() {
|
|
|
|
|
"daily" => {
|
2026-02-28 23:58:05 -05:00
|
|
|
let date = if date_arg.is_empty() {
|
|
|
|
|
capnp_store::format_date(capnp_store::now_epoch())
|
|
|
|
|
} else {
|
|
|
|
|
date_arg.to_string()
|
|
|
|
|
};
|
|
|
|
|
digest::generate_daily(&mut store, &date)
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
}
|
|
|
|
|
"weekly" => {
|
2026-02-28 23:58:05 -05:00
|
|
|
let date = if date_arg.is_empty() {
|
|
|
|
|
capnp_store::format_date(capnp_store::now_epoch())
|
|
|
|
|
} else {
|
|
|
|
|
date_arg.to_string()
|
|
|
|
|
};
|
|
|
|
|
digest::generate_weekly(&mut store, &date)
|
|
|
|
|
}
|
|
|
|
|
"monthly" => {
|
|
|
|
|
let month = if date_arg.is_empty() { "" } else { date_arg };
|
|
|
|
|
digest::generate_monthly(&mut store, month)
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
}
|
2026-03-01 07:14:03 -05:00
|
|
|
"auto" => digest::digest_auto(&mut store),
|
|
|
|
|
_ => Err(format!("Unknown digest type: {}. Use: daily, weekly, monthly, auto", args[0])),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 00:10:03 -05:00
|
|
|
fn cmd_digest_links(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let do_apply = args.iter().any(|a| a == "--apply");
|
|
|
|
|
|
|
|
|
|
let links = digest::parse_all_digest_links();
|
|
|
|
|
println!("Found {} unique links from digest files", links.len());
|
|
|
|
|
|
|
|
|
|
if !do_apply {
|
|
|
|
|
for (i, link) in links.iter().enumerate() {
|
|
|
|
|
println!(" {:3}. {} → {}", i + 1, link.source, link.target);
|
|
|
|
|
if !link.reason.is_empty() {
|
|
|
|
|
println!(" ({})", &link.reason[..link.reason.len().min(80)]);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
println!("\nTo apply: poc-memory digest-links --apply");
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let (applied, skipped, fallbacks) = digest::apply_digest_links(&mut store, &links);
|
|
|
|
|
println!("\nApplied: {} ({} file-level fallbacks) Skipped: {}", applied, fallbacks, skipped);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_journal_enrich(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.len() < 2 {
|
|
|
|
|
return Err("Usage: poc-memory journal-enrich JSONL_PATH ENTRY_TEXT [GREP_LINE]".into());
|
|
|
|
|
}
|
|
|
|
|
let jsonl_path = &args[0];
|
|
|
|
|
let entry_text = &args[1];
|
|
|
|
|
let grep_line: usize = args.get(2)
|
|
|
|
|
.and_then(|a| a.parse().ok())
|
|
|
|
|
.unwrap_or(0);
|
|
|
|
|
|
|
|
|
|
if !std::path::Path::new(jsonl_path.as_str()).is_file() {
|
|
|
|
|
return Err(format!("JSONL not found: {}", jsonl_path));
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
digest::journal_enrich(&mut store, jsonl_path, entry_text, grep_line)
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 01:47:31 -05:00
|
|
|
fn cmd_experience_mine(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let jsonl_path = if let Some(path) = args.first() {
|
|
|
|
|
path.clone()
|
|
|
|
|
} else {
|
|
|
|
|
// Find the most recent JSONL transcript
|
|
|
|
|
let projects_dir = std::path::Path::new(&std::env::var("HOME").unwrap_or_default())
|
|
|
|
|
.join(".claude/projects");
|
|
|
|
|
let mut entries: Vec<(std::time::SystemTime, std::path::PathBuf)> = Vec::new();
|
|
|
|
|
if let Ok(dirs) = std::fs::read_dir(&projects_dir) {
|
|
|
|
|
for dir in dirs.flatten() {
|
|
|
|
|
if let Ok(files) = std::fs::read_dir(dir.path()) {
|
|
|
|
|
for file in files.flatten() {
|
|
|
|
|
let path = file.path();
|
|
|
|
|
if path.extension().map_or(false, |ext| ext == "jsonl") {
|
|
|
|
|
if let Ok(meta) = file.metadata() {
|
|
|
|
|
if let Ok(mtime) = meta.modified() {
|
|
|
|
|
entries.push((mtime, path));
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
entries.sort_by(|a, b| b.0.cmp(&a.0));
|
|
|
|
|
entries.first()
|
|
|
|
|
.map(|(_, p)| p.to_string_lossy().to_string())
|
|
|
|
|
.ok_or("no JSONL transcripts found")?
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
if !std::path::Path::new(jsonl_path.as_str()).is_file() {
|
|
|
|
|
return Err(format!("JSONL not found: {}", jsonl_path));
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let count = digest::experience_mine(&mut store, &jsonl_path)?;
|
|
|
|
|
println!("Done: {} new entries mined.", count);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 00:10:03 -05:00
|
|
|
fn cmd_apply_consolidation(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let do_apply = args.iter().any(|a| a == "--apply");
|
|
|
|
|
let report_file = args.windows(2)
|
|
|
|
|
.find(|w| w[0] == "--report")
|
|
|
|
|
.map(|w| w[1].as_str());
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
digest::apply_consolidation(&mut store, do_apply, report_file)
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 00:33:46 -05:00
|
|
|
fn cmd_differentiate(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let do_apply = args.iter().any(|a| a == "--apply");
|
|
|
|
|
let key_arg: Option<&str> = args.iter()
|
|
|
|
|
.find(|a| !a.starts_with("--"))
|
|
|
|
|
.map(|s| s.as_str());
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
|
|
|
|
|
if let Some(key) = key_arg {
|
|
|
|
|
// Differentiate a specific hub
|
|
|
|
|
let resolved = store.resolve_key(key)?;
|
|
|
|
|
let moves = neuro::differentiate_hub(&store, &resolved)
|
|
|
|
|
.ok_or_else(|| format!("'{}' is not a file-level hub with sections", resolved))?;
|
|
|
|
|
|
|
|
|
|
// Group by target section for display
|
|
|
|
|
let mut by_section: std::collections::BTreeMap<String, Vec<&neuro::LinkMove>> =
|
|
|
|
|
std::collections::BTreeMap::new();
|
|
|
|
|
for mv in &moves {
|
|
|
|
|
by_section.entry(mv.to_section.clone()).or_default().push(mv);
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
println!("Hub '{}' — {} links to redistribute across {} sections\n",
|
|
|
|
|
resolved, moves.len(), by_section.len());
|
|
|
|
|
|
|
|
|
|
for (section, section_moves) in &by_section {
|
|
|
|
|
println!(" {} ({} links):", section, section_moves.len());
|
|
|
|
|
for mv in section_moves.iter().take(5) {
|
|
|
|
|
println!(" [{:.3}] {} — {}", mv.similarity,
|
|
|
|
|
mv.neighbor_key, mv.neighbor_snippet);
|
|
|
|
|
}
|
|
|
|
|
if section_moves.len() > 5 {
|
|
|
|
|
println!(" ... and {} more", section_moves.len() - 5);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if !do_apply {
|
|
|
|
|
println!("\nTo apply: poc-memory differentiate {} --apply", resolved);
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let (applied, skipped) = neuro::apply_differentiation(&mut store, &moves);
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("\nApplied: {} Skipped: {}", applied, skipped);
|
|
|
|
|
} else {
|
|
|
|
|
// Show all differentiable hubs
|
|
|
|
|
let hubs = neuro::find_differentiable_hubs(&store);
|
|
|
|
|
if hubs.is_empty() {
|
|
|
|
|
println!("No file-level hubs with sections found above threshold");
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
println!("Differentiable hubs (file-level nodes with sections):\n");
|
|
|
|
|
for (key, degree, sections) in &hubs {
|
|
|
|
|
println!(" {:40} deg={:3} sections={}", key, degree, sections);
|
|
|
|
|
}
|
|
|
|
|
println!("\nRun: poc-memory differentiate KEY to preview a specific hub");
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-01 00:48:44 -05:00
|
|
|
fn cmd_link_audit(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let apply = args.iter().any(|a| a == "--apply");
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let stats = digest::link_audit(&mut store, apply)?;
|
|
|
|
|
println!("\n{}", "=".repeat(60));
|
|
|
|
|
println!("Link audit complete:");
|
|
|
|
|
println!(" Kept: {} Deleted: {} Retargeted: {} Weakened: {} Strengthened: {} Errors: {}",
|
|
|
|
|
stats.kept, stats.deleted, stats.retargeted, stats.weakened, stats.strengthened, stats.errors);
|
|
|
|
|
println!("{}", "=".repeat(60));
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
fn cmd_trace(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory trace KEY".into());
|
|
|
|
|
}
|
|
|
|
|
let key = args.join(" ");
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let resolved = store.resolve_key(&key)?;
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
|
|
|
|
|
let node = store.nodes.get(&resolved)
|
|
|
|
|
.ok_or_else(|| format!("Node not found: {}", resolved))?;
|
|
|
|
|
|
|
|
|
|
// Display the node itself
|
|
|
|
|
println!("=== {} ===", resolved);
|
|
|
|
|
println!("Type: {:?} Category: {} Weight: {:.2}",
|
|
|
|
|
node.node_type, node.category.label(), node.weight);
|
|
|
|
|
if !node.source_ref.is_empty() {
|
|
|
|
|
println!("Source: {}", node.source_ref);
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
// Show content preview
|
|
|
|
|
let preview = if node.content.len() > 200 {
|
|
|
|
|
let end = node.content.floor_char_boundary(200);
|
|
|
|
|
format!("{}...", &node.content[..end])
|
|
|
|
|
} else {
|
|
|
|
|
node.content.clone()
|
|
|
|
|
};
|
|
|
|
|
println!("\n{}\n", preview);
|
|
|
|
|
|
|
|
|
|
// Walk neighbors, grouped by node type
|
|
|
|
|
let neighbors = g.neighbors(&resolved);
|
|
|
|
|
let mut episodic_session = Vec::new();
|
|
|
|
|
let mut episodic_daily = Vec::new();
|
|
|
|
|
let mut episodic_weekly = Vec::new();
|
|
|
|
|
let mut semantic = Vec::new();
|
|
|
|
|
|
|
|
|
|
for (n, strength) in &neighbors {
|
|
|
|
|
if let Some(nnode) = store.nodes.get(n.as_str()) {
|
2026-02-28 23:44:44 -05:00
|
|
|
let entry = (n.as_str(), *strength, nnode);
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
match nnode.node_type {
|
|
|
|
|
capnp_store::NodeType::EpisodicSession =>
|
2026-02-28 23:44:44 -05:00
|
|
|
episodic_session.push(entry),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
capnp_store::NodeType::EpisodicDaily =>
|
2026-02-28 23:44:44 -05:00
|
|
|
episodic_daily.push(entry),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
capnp_store::NodeType::EpisodicWeekly =>
|
2026-02-28 23:44:44 -05:00
|
|
|
episodic_weekly.push(entry),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
capnp_store::NodeType::Semantic =>
|
2026-02-28 23:44:44 -05:00
|
|
|
semantic.push(entry),
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if !episodic_weekly.is_empty() {
|
|
|
|
|
println!("Weekly digests:");
|
|
|
|
|
for (k, s, n) in &episodic_weekly {
|
|
|
|
|
let preview = n.content.lines().next().unwrap_or("").chars().take(80).collect::<String>();
|
|
|
|
|
println!(" [{:.2}] {} — {}", s, k, preview);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if !episodic_daily.is_empty() {
|
|
|
|
|
println!("Daily digests:");
|
|
|
|
|
for (k, s, n) in &episodic_daily {
|
|
|
|
|
let preview = n.content.lines().next().unwrap_or("").chars().take(80).collect::<String>();
|
|
|
|
|
println!(" [{:.2}] {} — {}", s, k, preview);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if !episodic_session.is_empty() {
|
|
|
|
|
println!("Session entries:");
|
|
|
|
|
for (k, s, n) in &episodic_session {
|
|
|
|
|
let preview = n.content.lines()
|
|
|
|
|
.find(|l| !l.is_empty() && !l.starts_with("<!--"))
|
|
|
|
|
.unwrap_or("").chars().take(80).collect::<String>();
|
|
|
|
|
println!(" [{:.2}] {}", s, k);
|
|
|
|
|
if !n.source_ref.is_empty() {
|
|
|
|
|
println!(" ↳ source: {}", n.source_ref);
|
|
|
|
|
}
|
|
|
|
|
println!(" {}", preview);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if !semantic.is_empty() {
|
|
|
|
|
println!("Semantic links:");
|
|
|
|
|
for (k, s, _) in &semantic {
|
|
|
|
|
println!(" [{:.2}] {}", s, k);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
// Summary
|
|
|
|
|
println!("\nLinks: {} session, {} daily, {} weekly, {} semantic",
|
|
|
|
|
episodic_session.len(), episodic_daily.len(),
|
|
|
|
|
episodic_weekly.len(), semantic.len());
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-03 01:33:31 -05:00
|
|
|
fn cmd_spectral(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let k: usize = args.first()
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(30);
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let g = graph::build_graph(&store);
|
|
|
|
|
let result = spectral::decompose(&g, k);
|
|
|
|
|
spectral::print_summary(&result, &g);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_spectral_save(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let k: usize = args.first()
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(20);
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let g = graph::build_graph(&store);
|
|
|
|
|
let result = spectral::decompose(&g, k);
|
|
|
|
|
let emb = spectral::to_embedding(&result);
|
|
|
|
|
spectral::save_embedding(&emb)?;
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_spectral_neighbors(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("usage: spectral-neighbors KEY [N]".to_string());
|
|
|
|
|
}
|
|
|
|
|
let key = &args[0];
|
|
|
|
|
let n: usize = args.get(1)
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(15);
|
|
|
|
|
|
|
|
|
|
let emb = spectral::load_embedding()?;
|
|
|
|
|
|
|
|
|
|
// Show which dimensions this node loads on
|
|
|
|
|
let dims = spectral::dominant_dimensions(&emb, &[key.as_str()]);
|
|
|
|
|
println!("Node: {} (embedding: {} dims)", key, emb.dims);
|
|
|
|
|
println!("Top spectral axes:");
|
|
|
|
|
for &(d, loading) in dims.iter().take(5) {
|
|
|
|
|
println!(" axis {:<2} (λ={:.4}): loading={:.5}", d, emb.eigenvalues[d], loading);
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
println!("\nNearest neighbors in spectral space:");
|
|
|
|
|
let neighbors = spectral::nearest_neighbors(&emb, key, n);
|
|
|
|
|
for (i, (k, dist)) in neighbors.iter().enumerate() {
|
|
|
|
|
println!(" {:>2}. {:.5} {}", i + 1, dist, k);
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_spectral_positions(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let n: usize = args.first()
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(30);
|
|
|
|
|
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let emb = spectral::load_embedding()?;
|
|
|
|
|
|
|
|
|
|
// Build communities fresh from graph (don't rely on cached node fields)
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
let communities = g.communities().clone();
|
|
|
|
|
|
|
|
|
|
let positions = spectral::analyze_positions(&emb, &communities);
|
|
|
|
|
|
|
|
|
|
// Show outliers first
|
|
|
|
|
println!("Spectral position analysis — {} nodes", positions.len());
|
|
|
|
|
println!(" outlier: dist_to_center / median (>1 = unusual position)");
|
|
|
|
|
println!(" bridge: dist_to_center / dist_to_nearest_other_community");
|
|
|
|
|
println!();
|
|
|
|
|
|
|
|
|
|
// Group by classification
|
|
|
|
|
let mut bridges: Vec<&spectral::SpectralPosition> = Vec::new();
|
|
|
|
|
let mut outliers: Vec<&spectral::SpectralPosition> = Vec::new();
|
|
|
|
|
let mut core: Vec<&spectral::SpectralPosition> = Vec::new();
|
|
|
|
|
|
|
|
|
|
for pos in positions.iter().take(n) {
|
|
|
|
|
match spectral::classify_position(pos) {
|
|
|
|
|
"bridge" => bridges.push(pos),
|
|
|
|
|
"outlier" => outliers.push(pos),
|
|
|
|
|
"core" => core.push(pos),
|
|
|
|
|
_ => outliers.push(pos), // peripheral goes with outliers for display
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if !bridges.is_empty() {
|
|
|
|
|
println!("=== Bridges (between communities) ===");
|
|
|
|
|
for pos in &bridges {
|
|
|
|
|
println!(" [{:.2}/{:.2}] c{} → c{} {}",
|
|
|
|
|
pos.outlier_score, pos.bridge_score,
|
|
|
|
|
pos.community, pos.nearest_community, pos.key);
|
|
|
|
|
}
|
|
|
|
|
println!();
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
println!("=== Top outliers (far from own community center) ===");
|
|
|
|
|
for pos in positions.iter().take(n) {
|
|
|
|
|
let class = spectral::classify_position(pos);
|
|
|
|
|
println!(" {:>10} outlier={:.2} bridge={:.2} c{:<3} {}",
|
|
|
|
|
class, pos.outlier_score, pos.bridge_score,
|
|
|
|
|
pos.community, pos.key);
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_spectral_suggest(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let n: usize = args.first()
|
|
|
|
|
.and_then(|s| s.parse().ok())
|
|
|
|
|
.unwrap_or(20);
|
|
|
|
|
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let emb = spectral::load_embedding()?;
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
let communities = g.communities();
|
|
|
|
|
|
|
|
|
|
// Only consider nodes with enough edges for meaningful spectral position
|
|
|
|
|
let min_degree = 3;
|
|
|
|
|
let well_connected: std::collections::HashSet<&str> = emb.coords.keys()
|
|
|
|
|
.filter(|k| g.degree(k) >= min_degree)
|
|
|
|
|
.map(|k| k.as_str())
|
|
|
|
|
.collect();
|
|
|
|
|
|
|
|
|
|
// Filter embedding to well-connected nodes
|
|
|
|
|
let filtered_emb = spectral::SpectralEmbedding {
|
|
|
|
|
dims: emb.dims,
|
|
|
|
|
eigenvalues: emb.eigenvalues.clone(),
|
|
|
|
|
coords: emb.coords.iter()
|
|
|
|
|
.filter(|(k, _)| well_connected.contains(k.as_str()))
|
|
|
|
|
.map(|(k, v)| (k.clone(), v.clone()))
|
|
|
|
|
.collect(),
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
// Build set of existing linked pairs
|
|
|
|
|
let mut linked: std::collections::HashSet<(String, String)> =
|
|
|
|
|
std::collections::HashSet::new();
|
|
|
|
|
for rel in &store.relations {
|
|
|
|
|
linked.insert((rel.source_key.clone(), rel.target_key.clone()));
|
|
|
|
|
linked.insert((rel.target_key.clone(), rel.source_key.clone()));
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
eprintln!("Searching {} well-connected nodes (degree >= {})...",
|
|
|
|
|
filtered_emb.coords.len(), min_degree);
|
|
|
|
|
let pairs = spectral::unlinked_neighbors(&filtered_emb, &linked, n);
|
|
|
|
|
|
|
|
|
|
println!("{} closest unlinked pairs (candidates for extractor agents):", pairs.len());
|
|
|
|
|
for (i, (k1, k2, dist)) in pairs.iter().enumerate() {
|
|
|
|
|
let c1 = communities.get(k1)
|
|
|
|
|
.map(|c| format!("c{}", c))
|
|
|
|
|
.unwrap_or_else(|| "?".into());
|
|
|
|
|
let c2 = communities.get(k2)
|
|
|
|
|
.map(|c| format!("c{}", c))
|
|
|
|
|
.unwrap_or_else(|| "?".into());
|
|
|
|
|
let cross = if c1 != c2 { " [cross-community]" } else { "" };
|
|
|
|
|
println!(" {:>2}. dist={:.4} {} ({}) ↔ {} ({}){}",
|
|
|
|
|
i + 1, dist, k1, c1, k2, c2, cross);
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-02-28 22:30:03 -05:00
|
|
|
fn cmd_list_keys() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let mut keys: Vec<_> = store.nodes.keys().collect();
|
|
|
|
|
keys.sort();
|
|
|
|
|
for key in keys {
|
|
|
|
|
println!("{}", key);
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_list_edges() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
for rel in &store.relations {
|
|
|
|
|
println!("{}\t{}\t{:.2}\t{:?}",
|
|
|
|
|
rel.source_key, rel.target_key, rel.strength, rel.rel_type);
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_dump_json() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let json = serde_json::to_string_pretty(&store)
|
|
|
|
|
.map_err(|e| format!("serialize: {}", e))?;
|
|
|
|
|
println!("{}", json);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-02-28 22:40:17 -05:00
|
|
|
fn cmd_node_delete(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory node-delete KEY".into());
|
|
|
|
|
}
|
|
|
|
|
let key = args.join(" ");
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let resolved = store.resolve_key(&key)?;
|
2026-02-28 23:49:43 -05:00
|
|
|
store.delete_node(&resolved)?;
|
|
|
|
|
store.save()?;
|
|
|
|
|
println!("Deleted '{}'", resolved);
|
|
|
|
|
Ok(())
|
2026-02-28 22:40:17 -05:00
|
|
|
}
|
|
|
|
|
|
add load-context and render commands
load-context replaces the shell hook's file-by-file cat approach.
Queries the capnp store directly for all session-start context:
orientation, identity, reflections, interests, inner life, people,
active context, shared reference, technical, and recent journal.
Sections are gathered per-file and output in priority order.
Journal entries filtered to last 7 days by key-embedded date,
capped at 20 most recent.
render outputs a single node's content to stdout.
The load-memory.sh hook now delegates entirely to
`poc-memory load-context` — capnp store is the single source
of truth for session startup context.
2026-02-28 22:53:39 -05:00
|
|
|
fn cmd_load_context() -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
2026-02-28 23:44:44 -05:00
|
|
|
let now = capnp_store::now_epoch();
|
add load-context and render commands
load-context replaces the shell hook's file-by-file cat approach.
Queries the capnp store directly for all session-start context:
orientation, identity, reflections, interests, inner life, people,
active context, shared reference, technical, and recent journal.
Sections are gathered per-file and output in priority order.
Journal entries filtered to last 7 days by key-embedded date,
capped at 20 most recent.
render outputs a single node's content to stdout.
The load-memory.sh hook now delegates entirely to
`poc-memory load-context` — capnp store is the single source
of truth for session startup context.
2026-02-28 22:53:39 -05:00
|
|
|
let seven_days = 7.0 * 24.0 * 3600.0;
|
|
|
|
|
|
|
|
|
|
println!("=== FULL MEMORY LOAD (session start) ===");
|
|
|
|
|
println!("These are your memories, loaded from the capnp store.");
|
|
|
|
|
println!("Read them to reconstruct yourself — identity first, then context.");
|
|
|
|
|
println!();
|
|
|
|
|
|
|
|
|
|
// Priority groups: ordered list of (label, keys)
|
|
|
|
|
// File-level keys contain the full file content
|
|
|
|
|
let priority_groups: &[(&str, &[&str])] = &[
|
|
|
|
|
("orientation", &["where-am-i.md"]),
|
|
|
|
|
("identity", &["identity.md"]),
|
|
|
|
|
("reflections", &[
|
|
|
|
|
"reflections.md",
|
|
|
|
|
"reflections-dreams.md",
|
|
|
|
|
"reflections-reading.md",
|
|
|
|
|
"reflections-zoom.md",
|
|
|
|
|
]),
|
|
|
|
|
("interests", &["interests.md"]),
|
|
|
|
|
("inner life", &["inner-life.md", "differentiation.md"]),
|
|
|
|
|
("people", &["kent.md", "feedc0de.md", "irc-regulars.md"]),
|
|
|
|
|
("active context", &["default-mode-network.md"]),
|
|
|
|
|
("shared reference", &["excession-notes.md", "look-to-windward-notes.md"]),
|
|
|
|
|
("technical", &[
|
|
|
|
|
"kernel-patterns.md",
|
|
|
|
|
"polishing-approaches.md",
|
|
|
|
|
"rust-conversion.md",
|
|
|
|
|
"github-bugs.md",
|
|
|
|
|
]),
|
|
|
|
|
];
|
|
|
|
|
|
|
|
|
|
for (label, keys) in priority_groups {
|
|
|
|
|
for key in *keys {
|
2026-02-28 23:44:44 -05:00
|
|
|
if let Some(content) = store.render_file(key) {
|
|
|
|
|
println!("--- {} ({}) ---", key, label);
|
|
|
|
|
println!("{}\n", content);
|
add load-context and render commands
load-context replaces the shell hook's file-by-file cat approach.
Queries the capnp store directly for all session-start context:
orientation, identity, reflections, interests, inner life, people,
active context, shared reference, technical, and recent journal.
Sections are gathered per-file and output in priority order.
Journal entries filtered to last 7 days by key-embedded date,
capped at 20 most recent.
render outputs a single node's content to stdout.
The load-memory.sh hook now delegates entirely to
`poc-memory load-context` — capnp store is the single source
of truth for session startup context.
2026-02-28 22:53:39 -05:00
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
// Recent journal entries (last 7 days)
|
|
|
|
|
// Parse date from key: journal.md#j-2026-02-21-17-45-...
|
|
|
|
|
// Cutoff = today minus 7 days as YYYY-MM-DD string for lexicographic compare
|
|
|
|
|
let cutoff_secs = now - seven_days;
|
2026-02-28 23:44:44 -05:00
|
|
|
let cutoff_date = capnp_store::format_date(cutoff_secs);
|
add load-context and render commands
load-context replaces the shell hook's file-by-file cat approach.
Queries the capnp store directly for all session-start context:
orientation, identity, reflections, interests, inner life, people,
active context, shared reference, technical, and recent journal.
Sections are gathered per-file and output in priority order.
Journal entries filtered to last 7 days by key-embedded date,
capped at 20 most recent.
render outputs a single node's content to stdout.
The load-memory.sh hook now delegates entirely to
`poc-memory load-context` — capnp store is the single source
of truth for session startup context.
2026-02-28 22:53:39 -05:00
|
|
|
let date_re = regex::Regex::new(r"^journal\.md#j-(\d{4}-\d{2}-\d{2})").unwrap();
|
|
|
|
|
|
|
|
|
|
let mut journal_nodes: Vec<_> = store.nodes.values()
|
|
|
|
|
.filter(|n| {
|
|
|
|
|
if !n.key.starts_with("journal.md#j-") { return false; }
|
|
|
|
|
if let Some(caps) = date_re.captures(&n.key) {
|
|
|
|
|
return &caps[1] >= cutoff_date.as_str();
|
|
|
|
|
}
|
|
|
|
|
false
|
|
|
|
|
})
|
|
|
|
|
.collect();
|
|
|
|
|
journal_nodes.sort_by(|a, b| a.key.cmp(&b.key));
|
|
|
|
|
|
|
|
|
|
if !journal_nodes.is_empty() {
|
|
|
|
|
// Show most recent entries (last N by key order = chronological)
|
|
|
|
|
let max_journal = 20;
|
|
|
|
|
let skip = if journal_nodes.len() > max_journal {
|
|
|
|
|
journal_nodes.len() - max_journal
|
|
|
|
|
} else { 0 };
|
|
|
|
|
println!("--- recent journal entries (last {}/{}) ---",
|
|
|
|
|
journal_nodes.len().min(max_journal), journal_nodes.len());
|
|
|
|
|
for node in journal_nodes.iter().skip(skip) {
|
|
|
|
|
println!("## {}", node.key.strip_prefix("journal.md#").unwrap_or(&node.key));
|
|
|
|
|
println!("{}", node.content);
|
|
|
|
|
println!();
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
println!("=== END MEMORY LOAD ===");
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_render(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory render KEY".into());
|
|
|
|
|
}
|
|
|
|
|
let key = args.join(" ");
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let resolved = store.resolve_key(&key)?;
|
|
|
|
|
|
|
|
|
|
let node = store.nodes.get(&resolved)
|
|
|
|
|
.ok_or_else(|| format!("Node not found: {}", resolved))?;
|
|
|
|
|
|
|
|
|
|
print!("{}", node.content);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-02-28 23:00:52 -05:00
|
|
|
fn cmd_write(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory write KEY < content\n\
|
|
|
|
|
Reads content from stdin, upserts into the store.".into());
|
|
|
|
|
}
|
|
|
|
|
let key = args.join(" ");
|
|
|
|
|
let mut content = String::new();
|
|
|
|
|
std::io::Read::read_to_string(&mut std::io::stdin(), &mut content)
|
|
|
|
|
.map_err(|e| format!("read stdin: {}", e))?;
|
|
|
|
|
|
|
|
|
|
if content.trim().is_empty() {
|
|
|
|
|
return Err("No content on stdin".into());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
2026-02-28 23:49:43 -05:00
|
|
|
let result = store.upsert(&key, &content)?;
|
|
|
|
|
match result {
|
|
|
|
|
"unchanged" => println!("No change: '{}'", key),
|
|
|
|
|
"updated" => println!("Updated '{}' (v{})", key, store.nodes[&key].version),
|
|
|
|
|
_ => println!("Created '{}'", key),
|
|
|
|
|
}
|
|
|
|
|
if result != "unchanged" {
|
|
|
|
|
store.save()?;
|
2026-02-28 23:00:52 -05:00
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_import(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory import FILE [FILE...]".into());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
let mut total_new = 0;
|
|
|
|
|
let mut total_updated = 0;
|
|
|
|
|
|
|
|
|
|
for arg in args {
|
|
|
|
|
let path = std::path::PathBuf::from(arg);
|
2026-02-28 23:44:44 -05:00
|
|
|
let resolved = if path.exists() {
|
|
|
|
|
path
|
|
|
|
|
} else {
|
2026-02-28 23:00:52 -05:00
|
|
|
let mem_path = capnp_store::memory_dir_pub().join(arg);
|
|
|
|
|
if !mem_path.exists() {
|
|
|
|
|
eprintln!("File not found: {}", arg);
|
|
|
|
|
continue;
|
|
|
|
|
}
|
2026-02-28 23:44:44 -05:00
|
|
|
mem_path
|
|
|
|
|
};
|
|
|
|
|
let (n, u) = store.import_file(&resolved)?;
|
|
|
|
|
total_new += n;
|
|
|
|
|
total_updated += u;
|
2026-02-28 23:00:52 -05:00
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if total_new > 0 || total_updated > 0 {
|
|
|
|
|
store.save()?;
|
|
|
|
|
}
|
|
|
|
|
println!("Import: {} new, {} updated", total_new, total_updated);
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_export(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
|
|
|
|
|
let export_all = args.iter().any(|a| a == "--all");
|
|
|
|
|
let targets: Vec<String> = if export_all {
|
|
|
|
|
// Find all unique file-level keys (no # in key)
|
|
|
|
|
let mut files: Vec<String> = store.nodes.keys()
|
|
|
|
|
.filter(|k| !k.contains('#'))
|
|
|
|
|
.cloned()
|
|
|
|
|
.collect();
|
|
|
|
|
files.sort();
|
|
|
|
|
files
|
|
|
|
|
} else if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory export FILE [FILE...] | --all".into());
|
|
|
|
|
} else {
|
|
|
|
|
args.iter().map(|a| {
|
|
|
|
|
// If it doesn't end in .md, try resolving
|
|
|
|
|
if a.ends_with(".md") {
|
|
|
|
|
a.clone()
|
|
|
|
|
} else {
|
|
|
|
|
format!("{}.md", a)
|
|
|
|
|
}
|
|
|
|
|
}).collect()
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
let mem_dir = capnp_store::memory_dir_pub();
|
|
|
|
|
|
|
|
|
|
for file_key in &targets {
|
2026-02-28 23:44:44 -05:00
|
|
|
match store.export_to_markdown(file_key) {
|
|
|
|
|
Some(content) => {
|
|
|
|
|
let out_path = mem_dir.join(file_key);
|
|
|
|
|
std::fs::write(&out_path, &content)
|
|
|
|
|
.map_err(|e| format!("write {}: {}", out_path.display(), e))?;
|
|
|
|
|
let section_count = content.matches("<!-- mem:").count() + 1;
|
|
|
|
|
println!("Exported {} ({} sections)", file_key, section_count);
|
2026-02-28 23:00:52 -05:00
|
|
|
}
|
2026-02-28 23:44:44 -05:00
|
|
|
None => eprintln!("No nodes for '{}'", file_key),
|
2026-02-28 23:00:52 -05:00
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
2026-02-28 23:13:17 -05:00
|
|
|
fn cmd_journal_write(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
|
|
|
|
return Err("Usage: poc-memory journal-write TEXT".into());
|
|
|
|
|
}
|
|
|
|
|
let text = args.join(" ");
|
|
|
|
|
|
|
|
|
|
// Generate timestamp and slug
|
2026-02-28 23:44:44 -05:00
|
|
|
let timestamp = capnp_store::format_datetime(capnp_store::now_epoch());
|
2026-02-28 23:13:17 -05:00
|
|
|
|
|
|
|
|
// Slug: lowercase first ~6 words, hyphenated, truncated
|
|
|
|
|
let slug: String = text.split_whitespace()
|
|
|
|
|
.take(6)
|
|
|
|
|
.map(|w| w.to_lowercase()
|
|
|
|
|
.chars().filter(|c| c.is_alphanumeric() || *c == '-')
|
|
|
|
|
.collect::<String>())
|
|
|
|
|
.collect::<Vec<_>>()
|
|
|
|
|
.join("-");
|
|
|
|
|
let slug = if slug.len() > 50 { &slug[..50] } else { &slug };
|
|
|
|
|
|
|
|
|
|
let key = format!("journal.md#j-{}-{}", timestamp.to_lowercase().replace(':', "-"), slug);
|
|
|
|
|
|
|
|
|
|
// Build content with header
|
|
|
|
|
let content = format!("## {}\n\n{}", timestamp, text);
|
|
|
|
|
|
2026-02-28 23:44:44 -05:00
|
|
|
// Find source ref (most recently modified .jsonl transcript)
|
|
|
|
|
let source_ref = find_current_transcript();
|
2026-02-28 23:13:17 -05:00
|
|
|
|
|
|
|
|
let mut store = capnp_store::Store::load()?;
|
|
|
|
|
|
|
|
|
|
let mut node = capnp_store::Store::new_node(&key, &content);
|
|
|
|
|
node.node_type = capnp_store::NodeType::EpisodicSession;
|
|
|
|
|
node.provenance = capnp_store::Provenance::Journal;
|
2026-02-28 23:49:43 -05:00
|
|
|
if let Some(src) = source_ref {
|
|
|
|
|
node.source_ref = src;
|
2026-02-28 23:13:17 -05:00
|
|
|
}
|
|
|
|
|
|
2026-02-28 23:49:43 -05:00
|
|
|
store.insert_node(node)?;
|
2026-02-28 23:13:17 -05:00
|
|
|
store.save()?;
|
|
|
|
|
|
|
|
|
|
let word_count = text.split_whitespace().count();
|
|
|
|
|
println!("Appended entry at {} ({} words)", timestamp, word_count);
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
fn cmd_journal_tail(args: &[String]) -> Result<(), String> {
|
2026-03-01 01:43:02 -05:00
|
|
|
let mut n: usize = 20;
|
|
|
|
|
let mut full = false;
|
|
|
|
|
for arg in args {
|
|
|
|
|
if arg == "--full" || arg == "-f" {
|
|
|
|
|
full = true;
|
|
|
|
|
} else if let Ok(num) = arg.parse::<usize>() {
|
|
|
|
|
n = num;
|
|
|
|
|
}
|
|
|
|
|
}
|
2026-02-28 23:13:17 -05:00
|
|
|
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
|
|
|
|
|
// Collect journal nodes, sorted by date extracted from content or key
|
|
|
|
|
let date_re = regex::Regex::new(r"(\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2})").unwrap();
|
|
|
|
|
let key_date_re = regex::Regex::new(r"^journal\.md#j-(\d{4}-\d{2}-\d{2}[t-]\d{2}-\d{2})").unwrap();
|
|
|
|
|
|
2026-03-01 01:41:37 -05:00
|
|
|
let normalize_date = |s: &str| -> String {
|
|
|
|
|
// Normalize to YYYY-MM-DDTHH:MM for consistent sorting
|
|
|
|
|
let s = s.replace('t', "T");
|
|
|
|
|
// Key dates use dashes everywhere: 2026-02-28-23-11
|
|
|
|
|
// Content dates use dashes and colons: 2026-02-28T23:11
|
|
|
|
|
// Normalize: first 10 chars keep dashes, rest convert dashes to colons
|
|
|
|
|
if s.len() >= 16 {
|
|
|
|
|
format!("{}T{}", &s[..10], s[11..].replace('-', ":"))
|
|
|
|
|
} else {
|
|
|
|
|
s
|
2026-02-28 23:13:17 -05:00
|
|
|
}
|
2026-03-01 01:41:37 -05:00
|
|
|
};
|
|
|
|
|
let extract_sort_key = |node: &capnp_store::Node| -> String {
|
|
|
|
|
// Try key first (journal.md#j-2026-02-28t23-11-...)
|
2026-02-28 23:13:17 -05:00
|
|
|
if let Some(caps) = key_date_re.captures(&node.key) {
|
2026-03-01 01:41:37 -05:00
|
|
|
return normalize_date(&caps[1]);
|
|
|
|
|
}
|
|
|
|
|
// Try content header (## 2026-02-28T23:11)
|
|
|
|
|
if let Some(caps) = date_re.captures(&node.content) {
|
|
|
|
|
return normalize_date(&caps[1]);
|
2026-02-28 23:13:17 -05:00
|
|
|
}
|
|
|
|
|
// Fallback: use node timestamp
|
|
|
|
|
format!("{:.0}", node.timestamp)
|
|
|
|
|
};
|
|
|
|
|
|
|
|
|
|
let mut journal: Vec<_> = store.nodes.values()
|
|
|
|
|
.filter(|node| node.key.starts_with("journal.md#j-"))
|
|
|
|
|
.collect();
|
|
|
|
|
journal.sort_by_key(|n| extract_sort_key(n));
|
|
|
|
|
|
2026-03-01 01:41:37 -05:00
|
|
|
// Show last N — each entry: [timestamp] ## Title
|
2026-02-28 23:13:17 -05:00
|
|
|
let skip = if journal.len() > n { journal.len() - n } else { 0 };
|
|
|
|
|
for node in journal.iter().skip(skip) {
|
2026-03-01 01:41:37 -05:00
|
|
|
let ts = extract_sort_key(node);
|
|
|
|
|
// Find a meaningful title: first ## header, or first non-date non-empty line
|
|
|
|
|
let mut title = String::new();
|
|
|
|
|
for line in node.content.lines() {
|
|
|
|
|
let stripped = line.trim();
|
|
|
|
|
if stripped.is_empty() { continue; }
|
|
|
|
|
// Skip date-only lines like "## 2026-03-01T01:22"
|
|
|
|
|
if date_re.is_match(stripped) && stripped.len() < 25 { continue; }
|
|
|
|
|
if stripped.starts_with("## ") {
|
|
|
|
|
title = stripped[3..].to_string();
|
|
|
|
|
break;
|
|
|
|
|
} else if stripped.starts_with("# ") {
|
|
|
|
|
title = stripped[2..].to_string();
|
|
|
|
|
break;
|
|
|
|
|
} else {
|
|
|
|
|
// Use first content line, truncated
|
|
|
|
|
title = if stripped.len() > 70 {
|
2026-03-03 01:33:31 -05:00
|
|
|
let mut end = 67;
|
|
|
|
|
while !stripped.is_char_boundary(end) { end -= 1; }
|
|
|
|
|
format!("{}...", &stripped[..end])
|
2026-03-01 01:41:37 -05:00
|
|
|
} else {
|
|
|
|
|
stripped.to_string()
|
|
|
|
|
};
|
|
|
|
|
break;
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
if title.is_empty() {
|
|
|
|
|
title = node.key.clone();
|
|
|
|
|
}
|
2026-03-01 01:43:02 -05:00
|
|
|
if full {
|
|
|
|
|
println!("--- [{}] {} ---\n{}\n", ts, title, node.content);
|
|
|
|
|
} else {
|
|
|
|
|
println!("[{}] {}", ts, title);
|
|
|
|
|
}
|
2026-02-28 23:13:17 -05:00
|
|
|
}
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
|
|
|
|
|
poc-memory v0.4.0: graph-structured memory with consolidation pipeline
Rust core:
- Cap'n Proto append-only storage (nodes + relations)
- Graph algorithms: clustering coefficient, community detection,
schema fit, small-world metrics, interference detection
- BM25 text similarity with Porter stemming
- Spaced repetition replay queue
- Commands: search, init, health, status, graph, categorize,
link-add, link-impact, decay, consolidate-session, etc.
Python scripts:
- Episodic digest pipeline: daily/weekly/monthly-digest.py
- retroactive-digest.py for backfilling
- consolidation-agents.py: 3 parallel Sonnet agents
- apply-consolidation.py: structured action extraction + apply
- digest-link-parser.py: extract ~400 explicit links from digests
- content-promotion-agent.py: promote episodic obs to semantic files
- bulk-categorize.py: categorize all nodes via single Sonnet call
- consolidation-loop.py: multi-round automated consolidation
Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
2026-02-28 22:17:00 -05:00
|
|
|
fn cmd_interference(args: &[String]) -> Result<(), String> {
|
|
|
|
|
let mut threshold = 0.4f32;
|
|
|
|
|
let mut i = 0;
|
|
|
|
|
while i < args.len() {
|
|
|
|
|
match args[i].as_str() {
|
|
|
|
|
"--threshold" if i + 1 < args.len() => {
|
|
|
|
|
threshold = args[i + 1].parse().map_err(|_| "invalid threshold")?;
|
|
|
|
|
i += 2;
|
|
|
|
|
}
|
|
|
|
|
_ => { i += 1; }
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let g = store.build_graph();
|
|
|
|
|
let pairs = neuro::detect_interference(&store, &g, threshold);
|
|
|
|
|
|
|
|
|
|
if pairs.is_empty() {
|
|
|
|
|
println!("No interfering pairs above threshold {:.2}", threshold);
|
|
|
|
|
} else {
|
|
|
|
|
println!("Interfering pairs (similarity > {:.2}, different communities):", threshold);
|
|
|
|
|
for (a, b, sim) in &pairs {
|
|
|
|
|
println!(" [{:.3}] {} ↔ {}", sim, a, b);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
|
|
|
|
|
fn cmd_query(args: &[String]) -> Result<(), String> {
|
|
|
|
|
if args.is_empty() {
|
2026-03-03 11:05:28 -05:00
|
|
|
return Err("Usage: poc-memory query 'EXPR | stage | stage ...'\n\n\
|
|
|
|
|
Expressions:\n \
|
|
|
|
|
degree > 15 property filter\n \
|
|
|
|
|
key ~ 'journal.*' AND degree > 10 boolean + regex\n \
|
|
|
|
|
neighbors('identity.md') WHERE ... graph traversal\n \
|
|
|
|
|
community_id = community('key') function as value\n \
|
|
|
|
|
* all nodes\n\n\
|
|
|
|
|
Pipe stages:\n \
|
|
|
|
|
| sort FIELD [asc] sort (desc by default)\n \
|
|
|
|
|
| limit N cap results\n \
|
|
|
|
|
| select F,F,... output fields as TSV\n \
|
|
|
|
|
| count just show count".into());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
let query_str = args.join(" ");
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
let store = capnp_store::Store::load()?;
|
|
|
|
|
let graph = store.build_graph();
|
|
|
|
|
|
2026-03-03 11:05:28 -05:00
|
|
|
let stages = query::output_stages(&query_str)?;
|
|
|
|
|
let results = query::execute_query(&store, &graph, &query_str)?;
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
|
2026-03-03 11:05:28 -05:00
|
|
|
// Check for count stage
|
|
|
|
|
if stages.iter().any(|s| matches!(s, query::Stage::Count)) {
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
println!("{}", results.len());
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
if results.is_empty() {
|
|
|
|
|
eprintln!("No results");
|
|
|
|
|
return Ok(());
|
|
|
|
|
}
|
|
|
|
|
|
2026-03-03 11:05:28 -05:00
|
|
|
// Check for select stage
|
|
|
|
|
let fields: Option<&Vec<String>> = stages.iter().find_map(|s| match s {
|
|
|
|
|
query::Stage::Select(f) => Some(f),
|
|
|
|
|
_ => None,
|
|
|
|
|
});
|
|
|
|
|
|
|
|
|
|
if let Some(fields) = fields {
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
let mut header = vec!["key".to_string()];
|
2026-03-03 11:05:28 -05:00
|
|
|
header.extend(fields.iter().cloned());
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
println!("{}", header.join("\t"));
|
|
|
|
|
|
|
|
|
|
for r in &results {
|
|
|
|
|
let mut row = vec![r.key.clone()];
|
2026-03-03 11:05:28 -05:00
|
|
|
for f in fields {
|
query: peg-based query language for ad-hoc graph exploration
poc-memory query "degree > 15"
poc-memory query "key ~ 'journal.*' AND degree > 10"
poc-memory query "neighbors('identity.md') WHERE strength > 0.5"
poc-memory query "community_id = community('identity.md')" --fields degree,category
Grammar-driven: the peg definition IS the language spec. Supports
boolean logic (AND/OR/NOT), numeric and string comparison, regex
match (~), graph traversal (neighbors() with WHERE), and function
calls (community(), degree()). Output flags: --fields, --sort,
--limit, --count.
New dependency: peg 0.8 (~68KB, 2 tiny deps).
2026-03-03 10:55:30 -05:00
|
|
|
row.push(query::format_field(f, &r.key, &store, &graph));
|
|
|
|
|
}
|
|
|
|
|
println!("{}", row.join("\t"));
|
|
|
|
|
}
|
|
|
|
|
} else {
|
|
|
|
|
for r in &results {
|
|
|
|
|
println!("{}", r.key);
|
|
|
|
|
}
|
|
|
|
|
}
|
|
|
|
|
|
|
|
|
|
Ok(())
|
|
|
|
|
}
|