consciousness/src/subconscious
ProofOfConcept be65399710 Switch memory scoring from chat messages to raw token IDs
The /score endpoint was receiving chat-format messages, which had to
go through the chat template tokenizer; this failed with "System
message must be first" errors because the AST structure doesn't map
cleanly to the chat message format.

Send raw token IDs via the new `prompt` field instead, matching what
the /completions endpoint already does. The vLLM score endpoint finds
assistant boundaries by scanning for <|im_start|>assistant token
patterns, so no message-level metadata is needed.
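The two mechanisms described above can be sketched roughly as follows. This is a minimal illustration, not the actual `learn.rs` code: the request field names beyond `prompt`, the `build_score_request` / `find_assistant_starts` helpers, and the concrete token IDs are all assumptions for the example.

```rust
// Hypothetical sketch: send raw token IDs in a `prompt` field (as the
// /completions endpoint does) instead of chat-format messages, and find
// assistant boundaries by scanning the token stream for a marker pattern.

/// Build a /score request body carrying raw token IDs in `prompt`.
/// No chat-template messages means no "System message must be first" error.
fn build_score_request(model: &str, token_ids: &[u32]) -> String {
    let ids = token_ids
        .iter()
        .map(|id| id.to_string())
        .collect::<Vec<_>>()
        .join(",");
    format!(r#"{{"model":"{}","prompt":[{}]}}"#, model, ids)
}

/// Return the offsets just past each occurrence of `pattern` in `tokens`,
/// e.g. the positions where assistant spans begin after an
/// <|im_start|>assistant token sequence (pattern IDs here are made up).
fn find_assistant_starts(tokens: &[u32], pattern: &[u32]) -> Vec<usize> {
    tokens
        .windows(pattern.len())
        .enumerate()
        .filter(|(_, w)| *w == pattern)
        .map(|(i, _)| i + pattern.len())
        .collect()
}

fn main() {
    let body = build_score_request("qwen", &[151644, 77091, 198]);
    println!("{}", body); // {"model":"qwen","prompt":[151644,77091,198]}

    let starts = find_assistant_starts(&[5, 1, 2, 9, 1, 2, 7], &[1, 2]);
    println!("{:?}", starts); // [3, 6]
}
```

Because the boundary scan works purely on token IDs, the server needs no message-level metadata to know where assistant output begins.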

Also includes identity and journal sections in the scored context,
matching what the model actually sees during inference.

Co-Authored-By: Proof of Concept <poc@bcachefs.org>
2026-04-09 21:07:00 -04:00
agents Feed observe agents their recent writes to prevent duplicate nodes 2026-04-08 23:27:12 -04:00
audit.rs Kill log callback — use ConversationEntry::Log for debug traces 2026-04-07 01:23:22 -04:00
consolidate.rs Reduce pub visibility: hippocampus, subconscious internals 2026-04-07 17:29:12 -04:00
daemon.rs Remove poc-memory daemon and RPC infrastructure 2026-04-09 20:07:05 -04:00
defs.rs Spacebar toggle for all agents, persist to config, scan agent directory 2026-04-09 00:51:10 -04:00
digest.rs Output tool via Arc<Mutex<Subconscious>> closure — complete 2026-04-08 20:41:42 -04:00
learn.rs Switch memory scoring from chat messages to raw token IDs 2026-04-09 21:07:00 -04:00
mod.rs Fix: reap stale agent pid files in poc-hook 2026-04-07 13:27:59 -04:00
prompts.rs Reduce pub visibility: hippocampus, subconscious internals 2026-04-07 17:29:12 -04:00