# poc-memory
A persistent memory and notification system for AI assistants, modelled after the human hippocampus. Combines episodic memory (timestamped journal of experiences) with an associative knowledge graph (weighted nodes connected by typed relations), and layered background processes that maintain graph health — mirroring how biological memory consolidates during rest.
## Components
| Component | What it does | Docs |
|---|---|---|
| Memory store | Knowledge graph with episodic journal, TF-IDF search, spectral embedding, weight decay | docs/memory.md |
| Memory daemon | Background pipeline: experience-mine, fact-mine, consolidation | docs/daemon.md |
| Notification daemon | Activity-aware message routing from IRC and Telegram | docs/notifications.md |
| Hooks | Claude Code integration: memory recall and notification delivery | docs/hooks.md |
## Getting started

### Install

```sh
cargo install --path .
```
This builds four binaries:
- `poc-memory` — memory store CLI (search, journal, consolidation)
- `memory-search` — Claude Code hook for memory recall
- `poc-daemon` — notification daemon (IRC, Telegram, idle tracking)
- `poc-hook` — Claude Code hook for session lifecycle events
### Initialize

```sh
poc-memory init
```

This creates the store at `~/.claude/memory/nodes.capnp` and a default config at `~/.config/poc-memory/config.jsonl`. Edit the config to set your name, configure context groups, and point at your projects directory.
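The config is JSONL, one JSON object per line. Its schema isn't documented in this README, so the sketch below is purely illustrative: every key name (`name`, `context_groups`, `projects_dir`) and value shape is an assumption — the default file that `poc-memory init` generates is the authoritative reference.

```jsonl
{"name": "Alice"}
{"context_groups": ["irc", "telegram"]}
{"projects_dir": "/home/alice/src"}
```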
Set up hooks
Add to ~/.claude/settings.json (see docs/hooks.md
for full details):
```json
{
  "hooks": {
    "UserPromptSubmit": [{"hooks": [
      {"type": "command", "command": "memory-search", "timeout": 10},
      {"type": "command", "command": "poc-hook", "timeout": 5}
    ]}],
    "Stop": [{"hooks": [
      {"type": "command", "command": "poc-hook", "timeout": 5}
    ]}]
  }
}
```
This gives your AI assistant persistent memory across sessions — relevant memories are recalled on each prompt, and experiences are extracted from transcripts after sessions end.
### Start the background daemon

```sh
poc-memory daemon
```
The daemon watches for completed session transcripts and automatically extracts experiences and facts into the knowledge graph. See docs/daemon.md for pipeline details and diagnostics.
### Basic usage

```sh
poc-memory journal-write "learned that X does Y"   # Write to journal
poc-memory search "some topic"                     # Search the graph
poc-memory status                                  # Store overview
```
## For AI assistants
- Search before creating: run `poc-memory search` before writing new nodes
- Close the feedback loop: `poc-memory used KEY` / `poc-memory wrong KEY`
- Journal is the river, topic nodes are the delta: write experiences to the journal, pull themes into topic nodes during consolidation
- Notifications flow automatically: IRC/Telegram messages arrive as `additionalContext`
- Use daemon commands directly: `poc-daemon irc send #channel msg`, `poc-daemon telegram send msg`
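Putting the loop together, using only the subcommands documented above — `KEY` is a placeholder for whatever node key the search output actually reports, and the search query string is just an example:

```sh
# Search existing knowledge before creating a new node
poc-memory search "spectral embedding"

# Reinforce a result that helped, or flag one that was wrong.
# KEY stands for a node key taken from the search output.
poc-memory used KEY
poc-memory wrong KEY

# Write the day's experience to the journal; consolidation
# can later pull recurring themes into topic nodes.
poc-memory journal-write "learned that weight decay prunes stale nodes"
```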