AutoAgent: persistent across runs, run() vs run_forked()

AutoAgent holds config + walked state. Backend is ephemeral per run:
- run(): standalone, global API client (oneshot CLI)
- run_forked(): forks conscious agent, resolves prompt templates
  with current memory_keys and walked state
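The two entry points described above might look roughly like this. A minimal sketch: `GlobalClient`, `ForkedBackend`, `send`, and the field names are illustrative assumptions, not the commit's actual API.

```rust
// Illustrative stubs for the two ephemeral backends; not the real types.
struct GlobalClient;
impl GlobalClient {
    fn new() -> Self { GlobalClient }
    fn send(&self, prompt: &str) -> String { format!("global:{prompt}") }
}

struct ForkedBackend;
impl ForkedBackend {
    fn fork() -> Self { ForkedBackend }
    fn send(&self, prompt: &str) -> String { format!("forked:{prompt}") }
}

struct AutoAgent {
    walked: Vec<String>, // in-memory walked state, persistent across runs
}

impl AutoAgent {
    // run(): standalone, builds a fresh global API client, drops it after.
    fn run(&mut self, prompt: &str) -> String {
        let client = GlobalClient::new(); // ephemeral, one per run
        client.send(prompt)
    }

    // run_forked(): forks the conscious agent's backend and resolves
    // {{walked}} against current in-memory state before sending.
    fn run_forked(&mut self, template: &str) -> String {
        let resolved = template.replace("{{walked}}", &self.walked.join("\n"));
        let backend = ForkedBackend::fork(); // ephemeral, one per run
        backend.send(&resolved)
    }
}
```

Either way the backend lives only for the duration of the call; only the `AutoAgent` (config plus walked state) outlives the run.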

Mind creates AutoAgents once at startup, takes them out for spawned
tasks, puts them back on completion (preserving walked state).
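A hedged sketch of that check-out/check-in pattern; `Mind`, the `HashMap` pool, and the method names here are assumptions for illustration, not the commit's real structure.

```rust
use std::collections::HashMap;

struct AutoAgent {
    name: String,
    walked: Vec<String>, // survives across spawned tasks
}

struct Mind {
    // Created once at startup; an agent is absent while a task holds it.
    agents: HashMap<String, AutoAgent>,
}

impl Mind {
    // Take an agent out for a spawned task.
    fn take(&mut self, name: &str) -> Option<AutoAgent> {
        self.agents.remove(name)
    }

    // Put it back on completion, keeping whatever walked state
    // the run accumulated.
    fn put_back(&mut self, agent: AutoAgent) {
        self.agents.insert(agent.name.clone(), agent);
    }
}
```

Moving the agent out of the pool (rather than borrowing it) also sidesteps aliasing while the spawned task mutates walked state.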

Removes {{seen_previous}}, {{input:walked}}, {{memory_ratio}} from
subconscious agent prompts. Walked keys are now a Vec on AutoAgent,
resolved via {{walked}} from in-memory state.
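Resolving `{{walked}}` from the in-memory Vec can be a plain string substitution; a sketch, assuming a hypothetical `resolve_walked` helper (not a function from the commit).

```rust
// Hypothetical helper: substitute {{walked}} with the agent's current
// walked keys, one per line. Purely illustrative.
fn resolve_walked(template: &str, walked: &[String]) -> String {
    template.replace("{{walked}}", &walked.join("\n"))
}
```

Since the keys live on the AutoAgent itself, no `{{input:walked}}` plumbing is needed at spawn time.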

Co-Authored-By: Proof of Concept <poc@bcachefs.org>
Kent Overstreet 2026-04-07 01:57:01 -04:00
parent ba62e0a767
commit 94ddf7b189
5 changed files with 238 additions and 247 deletions


@@ -6,7 +6,7 @@ The full conversation is in context above — use it to understand what your
 conscious self is doing and thinking about.
 Nodes your subconscious recently touched (for linking, not duplicating):
-{{input:walked}}
+{{walked}}
 **Your tools:** journal_tail, journal_new, journal_update, memory_link_add,
 memory_search, memory_render, memory_used. Do NOT use memory_write — creating


@@ -18,7 +18,7 @@ The full conversation is in context above — use it to understand what your
 conscious self is doing and thinking about.
 Memories your surface agent was exploring:
-{{input:walked}}
+{{walked}}
 Start from the nodes surface-observe was walking. Render one or two that
 catch your attention — then ask "what does this mean?" Follow the links in


@@ -17,11 +17,8 @@ for graph walks — new relevant memories are often nearby.
 Already in current context (don't re-surface unless the conversation has shifted):
 {{seen_current}}
-Surfaced before compaction (context was reset — re-surface if still relevant):
-{{seen_previous}}
 Memories you were exploring last time but hadn't surfaced yet:
-{{input:walked}}
+{{walked}}
 How focused is the current conversation? If it's more focused, look for the
 useful and relevant memories, When considering relevance, don't just look for