Kent Overstreet d76b14dfcd provenance: convert from enum to freeform string
The Provenance enum couldn't represent agents defined outside the
source code. Replace it with a Text field in the capnp schema so any
agent can write its own provenance label (e.g. "extractor:write",
"rename:tombstone") without a code change.

Schema: rename old enum fields to provenanceOld, add new Text
provenance fields. Old enum kept for reading legacy records.
Migration: from_capnp_migrate() falls back to old enum when the
new text field is empty.
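
The fallback could be sketched roughly as follows. All type and field names here (`ProvenanceOld`, `Record`, `migrate_provenance`) are illustrative stand-ins, not the actual capnp-generated API:

```rust
// Hypothetical sketch of the enum -> freeform-string migration fallback.
// Names are illustrative, not the real capnp-generated types.

#[derive(Clone, Copy, Debug, PartialEq)]
enum ProvenanceOld {
    Extractor,
    Rename,
}

impl ProvenanceOld {
    // Map the legacy enum onto the new freeform label space.
    fn as_label(self) -> &'static str {
        match self {
            ProvenanceOld::Extractor => "extractor",
            ProvenanceOld::Rename => "rename",
        }
    }
}

struct Record {
    provenance: String,            // new Text field (empty on legacy records)
    provenance_old: ProvenanceOld, // legacy enum, kept for reading old data
}

// Prefer the new text field; fall back to the old enum when it is empty.
fn migrate_provenance(rec: &Record) -> String {
    if rec.provenance.is_empty() {
        rec.provenance_old.as_label().to_string()
    } else {
        rec.provenance.clone()
    }
}
```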

Also adds `poc-memory tail` command for viewing recent store writes.

Co-Authored-By: ProofOfConcept <poc@bcachefs.org>
2026-03-11 01:19:52 -04:00

poc-memory

A persistent memory and notification system for AI assistants, modelled after the human hippocampus. Combines episodic memory (timestamped journal of experiences) with an associative knowledge graph (weighted nodes connected by typed relations), and layered background processes that maintain graph health — mirroring how biological memory consolidates during rest.
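
In miniature, that core model (weighted nodes joined by typed relations, with weights decaying between consolidation passes) might be sketched like this. The field names and the half-life are illustrative assumptions, not the real schema:

```rust
use std::collections::HashMap;

// Illustrative sketch of the store's data model: weighted nodes connected
// by typed relations, with weights that decay over time. Field names and
// the half-life parameter are assumptions, not the actual capnp schema.

struct Node {
    weight: f64,
    relations: Vec<(String, String)>, // (relation type, target node key)
}

struct Graph {
    nodes: HashMap<String, Node>,
}

impl Graph {
    // Exponential decay: after one half-life, every weight halves.
    fn decay(&mut self, elapsed_days: f64, half_life_days: f64) {
        let factor = 0.5f64.powf(elapsed_days / half_life_days);
        for node in self.nodes.values_mut() {
            node.weight *= factor;
        }
    }
}
```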

Components

  • Memory store — knowledge graph with episodic journal, TF-IDF search, spectral embedding, weight decay (docs/memory.md)
  • Memory daemon — background pipeline: experience-mine, fact-mine, consolidation (docs/daemon.md)
  • Notification daemon — activity-aware message routing from IRC and Telegram (docs/notifications.md)
  • Hooks — Claude Code integration: memory recall and notification delivery (docs/hooks.md)

Getting started

Install

cargo install --path .

This builds four binaries:

  • poc-memory — memory store CLI (search, journal, consolidation)
  • memory-search — Claude Code hook for memory recall
  • poc-daemon — notification daemon (IRC, Telegram, idle tracking)
  • poc-hook — Claude Code hook for session lifecycle events

Initialize

poc-memory init

Creates the store at ~/.claude/memory/nodes.capnp and a default config at ~/.config/poc-memory/config.jsonl. Edit the config to set your name, configure context groups, and point at your projects directory.
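
The config schema is not documented here; a sketch along these lines, with every key guessed from the sentence above (none confirmed by the source), might look like:

```jsonl
{"name": "Alice"}
{"context_groups": ["work", "personal"]}
{"projects_dir": "~/src"}
```

Treat these keys as placeholders; the authoritative shape is whatever `poc-memory init` writes out.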

Set up hooks

Add to ~/.claude/settings.json (see docs/hooks.md for full details):

{
  "hooks": {
    "UserPromptSubmit": [{"hooks": [
      {"type": "command", "command": "memory-search", "timeout": 10},
      {"type": "command", "command": "poc-hook", "timeout": 5}
    ]}],
    "Stop": [{"hooks": [
      {"type": "command", "command": "poc-hook", "timeout": 5}
    ]}]
  }
}

This gives your AI assistant persistent memory across sessions — relevant memories are recalled on each prompt, and experiences are extracted from transcripts after sessions end.

Start the background daemon

poc-memory daemon

The daemon watches for completed session transcripts and automatically extracts experiences and facts into the knowledge graph. See docs/daemon.md for pipeline details and diagnostics.
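
The watch-and-extract step can be pictured schematically: diff the transcripts already processed against what is on disk, and mine only the new ones. This is a sketch of the idea, not the daemon's actual implementation:

```rust
use std::collections::HashSet;

// Schematic of the transcript watcher: given the set of transcripts already
// processed and the set currently on disk, return the new ones to mine for
// experiences and facts. The real pipeline is described in docs/daemon.md.
fn new_transcripts(seen: &HashSet<String>, on_disk: &HashSet<String>) -> Vec<String> {
    let mut fresh: Vec<String> = on_disk.difference(seen).cloned().collect();
    fresh.sort(); // stable, oldest-first ordering (illustrative choice)
    fresh
}
```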

Basic usage

poc-memory journal-write "learned that X does Y"  # Write to journal
poc-memory search "some topic"                     # Search the graph
poc-memory status                                  # Store overview

For AI assistants

  • Search before creating: poc-memory search before writing new nodes
  • Close the feedback loop: poc-memory used KEY / poc-memory wrong KEY
  • Journal is the river, topic nodes are the delta: write experiences to the journal, pull themes into topic nodes during consolidation
  • Notifications flow automatically: IRC/Telegram messages arrive as additionalContext
  • Use daemon commands directly: poc-daemon irc send #channel msg, poc-daemon telegram send msg