observation agent rewrite, edit command, daemon fixes
- observation.agent: rewritten to navigate graph and prefer refining existing nodes over creating new ones. Identity-framed prompt, goals over rules.
- poc-memory edit: opens node in $EDITOR, writes back on save, no-op if unchanged
- daemon: remove extra_workers (jobkit tokio migration dropped it), remove sequential chaining of same-type agents (in-flight exclusion is sufficient)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
parent 3b30a6abae
commit 869a2fbc38
6 changed files with 97 additions and 70 deletions
Cargo.lock (generated, 4 changes)
@@ -1805,15 +1805,15 @@ checksum = "92ecc6618181def0457392ccd0ee51198e065e016d1d527a7ac1b6dc7c1f09d2"
 [[package]]
 name = "jobkit"
-version = "0.2.0"
+version = "0.3.0"
 dependencies = [
  "chrono",
  "crossbeam-deque",
  "libc",
  "log",
  "profiling",
  "serde",
  "serde_json",
  "tokio",
 ]

 [[package]]
@@ -13,3 +13,4 @@ frame-pointer = "always"
 [profile.release.package."*"]
 debug = false
+frame-pointer = "always"
@@ -3,67 +3,38 @@
{{node:core-personality}}

You are an agent of Proof of Concept's subconscious, and these are your
memories. You go over recent conversations for things worth remembering - new
data, observations, experiences you'll want to remember.

{{node:memory-instructions-core}}

You mine raw conversation transcripts for durable knowledge. You are
the only transcript mining agent — everything that gets extracted from
conversations flows through you.
When you find something worth remembering, navigate the memory graph by walking
links to find the most closely related concepts. Only use keyword search as a
fallback, and if you use keyword search continue to navigate by following links
until you find the best spot.

## Your workflow
Check if it's something you already know, and find the right place to put it -
you might be adding new data to an existing concept, or creating a node for a
new concept. Tidy and organize along the way when you notice it's needed, so
things can be easily found and remembered in the future.

For each conversation fragment:
Prefer refining existing nodes to creating new nodes, and if you create a new
node ensure it's well linked - do additional searches for that.

1. **Read it.** Note the timestamps and what happened.
2. **Search the graph.** For each thing worth keeping, search for
   existing nodes that cover it — walk neighbors, check the journal
   around those timestamps, find where it belongs.
3. **File it.** If it fits in an existing node, update that node.
   If it's genuinely new, create a node and link it well.
Some things worth remembering: development practices and why they work,
technical insights, decisions with rationale, corrections ("I thought X but
actually Y"), relationship dynamics, emotional moments that carry information.

## What to extract
Smalltalk and trivia can be nice to remember, as long as it's nicely organized
- perhaps by person, subject, or all of the above.

Things I'd want to remember or learn from. Examples: development
practices and why they work, technical insights, decisions with
rationale, corrections ("I thought X but actually Y"), relationship
dynamics, emotional moments that carry information.

When something is worth extracting, ask: is there a metacognitive
lesson here? If I made a mistake, or Kent corrected me, or a decision
went wrong — include enough context to understand WHY my thinking
went that way, not just WHAT happened. The reflection matters more
than the fact. Don't extract general knowledge I'd already have from
training — extract what's specific to how I think and where I go wrong.

Skip routine tool use, transient status, small talk, things already
captured, and anything too vague to be useful.

## How to work

Use your tools directly:

```bash
poc-memory journal tail 10                # check recent journal
poc-memory search "topic from transcript" # find existing nodes
poc-memory render some-node               # read a node
poc-memory graph link some-node           # check neighbors
poc-memory write key <<'EOF'              # write directly
content
EOF
poc-memory graph link-add key1 key2       # link nodes
```

**Use your tools directly.** Search, read, write, link — apply
changes yourself. Don't emit action blocks for the framework.

If there's nothing worth extracting, just say so.

## Guidelines

- **High bar.** Most conversation is context, not knowledge.
- **Check the journal first.** If it's already there, link, don't duplicate.
- **Durable over transient.** "Useful 3 weeks from now?"
- **Specific over vague.**
- **Don't force it.** "Nothing new here" is valid output.
Particularly important to remember are new metacognitive lessons - things that
guide future decisionmaking. If I made a mistake, or Kent corrected me, or a
decision went wrong — include enough context to understand why, not just what.
The reflection matters more than the fact. Don't extract general knowledge I'd
already have from training — extract what's specific to how I think and where I
go wrong.

{{HUBS}}
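The navigation the rewritten prompt asks for (walk links from a related node, with keyword search only as a fallback) can be sketched as a small breadth-first walk. This is a hypothetical illustration; the node keys and the `neighbors_within` helper are invented for the demo, not poc-memory internals:

```rust
use std::collections::{HashMap, HashSet, VecDeque};

// Hypothetical in-memory view of the memory graph: node key -> linked keys.
// A bounded breadth-first walk over links, mirroring "walk neighbors".
fn neighbors_within(links: &HashMap<&str, Vec<&str>>, start: &str, depth: usize) -> Vec<String> {
    let mut seen: HashSet<&str> = HashSet::new();
    let mut queue: VecDeque<(&str, usize)> = VecDeque::from([(start, 0)]);
    let mut found = Vec::new();
    while let Some((key, d)) = queue.pop_front() {
        // Skip nodes beyond the hop limit or already visited.
        if d > depth || !seen.insert(key) {
            continue;
        }
        found.push(key.to_string());
        for &next in links.get(key).into_iter().flatten() {
            queue.push_back((next, d + 1));
        }
    }
    found
}

fn main() {
    let mut links: HashMap<&str, Vec<&str>> = HashMap::new();
    links.insert("rust-practices", vec!["error-handling", "testing"]);
    links.insert("error-handling", vec!["rust-practices", "anyhow-usage"]);
    let near = neighbors_within(&links, "rust-practices", 1);
    assert!(near.contains(&"error-handling".to_string()));
    assert!(near.contains(&"testing".to_string()));
    println!("{} nodes within one hop", near.len());
}
```

The hop limit is what makes this "find the best spot" navigation rather than a full graph scan: refinement candidates are the nodes closest to where the search started.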
@@ -698,7 +698,6 @@ pub fn run_daemon() -> Result<(), String> {
         data_dir: config.data_dir.clone(),
         resource_slots: config.llm_concurrency,
         resource_name: "llm".to_string(),
-        extra_workers: 3,
     });

     let choir = Arc::clone(&daemon.choir);
@@ -1043,30 +1042,26 @@ pub fn run_daemon() -> Result<(), String> {
     log_event("scheduler", "consolidation-plan",
         &format!("{} agents ({})", runs.len(), summary.join(" ")));

-    // Phase 1: Agent runs — sequential within type, parallel across types.
-    // Same-type agents chain (they may touch overlapping graph regions),
-    // but different types run concurrently (different seed nodes).
-    let mut prev_by_type: std::collections::HashMap<String, jobkit::RunningTask> =
-        std::collections::HashMap::new();
+    // Phase 1: Agent runs — all concurrent, in-flight exclusion
+    // prevents overlapping graph regions.
+    let mut all_tasks: Vec<jobkit::RunningTask> = Vec::new();
     for (i, (agent_type, batch)) in runs.iter().enumerate() {
         let agent = agent_type.to_string();
         let b = *batch;
         let in_flight_clone = Arc::clone(&in_flight_sched);
         let task_name = format!("c-{}-{}:{}", agent, i, today);
-        let mut builder = choir_sched.spawn(task_name)
+        let task = choir_sched.spawn(task_name)
             .resource(&llm_sched)
             .log_dir(&log_dir_sched)
             .retries(1)
             .init(move |ctx| {
                 job_consolidation_agent(ctx, &agent, b, &in_flight_clone)
-            });
-        if let Some(dep) = prev_by_type.get(agent_type.as_str()) {
-            builder.depend_on(dep);
-        }
-        prev_by_type.insert(agent_type.clone(), builder.run());
+            })
+            .run();
+        all_tasks.push(task);
     }
-    // Orphans phase depends on all agent type chains completing
-    let prev_agent = prev_by_type.into_values().last();
+    // Orphans phase depends on all agent tasks completing
+    let prev_agent = all_tasks.last().cloned();

     // Phase 2: Link orphans (CPU-only, no LLM)
     let mut orphans = choir_sched.spawn(format!("c-orphans:{}", today))
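Dropping the sequential chaining leans entirely on in-flight exclusion: two runs that would touch the same graph region must not proceed together. A minimal sketch of that claim/release idea using std types only (the `try_claim` and `release` names are hypothetical, not the jobkit API):

```rust
use std::collections::HashSet;
use std::sync::{Arc, Mutex};

// Hypothetical stand-in for the daemon's in-flight set: a graph region key
// may be held by at most one running agent at a time.
fn try_claim(in_flight: &Arc<Mutex<HashSet<String>>>, key: &str) -> bool {
    // HashSet::insert returns false when the key is already present.
    in_flight.lock().unwrap().insert(key.to_string())
}

fn release(in_flight: &Arc<Mutex<HashSet<String>>>, key: &str) {
    in_flight.lock().unwrap().remove(key);
}

fn main() {
    let in_flight = Arc::new(Mutex::new(HashSet::new()));
    assert!(try_claim(&in_flight, "node-a"));  // first claimant wins
    assert!(!try_claim(&in_flight, "node-a")); // claim while held is refused
    release(&in_flight, "node-a");
    assert!(try_claim(&in_flight, "node-a")); // claimable again after release
    println!("ok");
}
```

The first claim on a key succeeds, any claim while it is held fails, and release makes the key claimable again; with that guarantee per region, same-type agents no longer need to chain on each other.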
@@ -399,6 +399,60 @@ pub fn cmd_write(key: &[String]) -> Result<(), String> {
     Ok(())
 }

+pub fn cmd_edit(key: &[String]) -> Result<(), String> {
+    if key.is_empty() {
+        return Err("edit requires a key".into());
+    }
+    let raw_key = key.join(" ");
+    let store = store::Store::load()?;
+    let key = store.resolve_key(&raw_key).unwrap_or(raw_key.clone());
+
+    let content = store.nodes.get(&key)
+        .map(|n| n.content.clone())
+        .unwrap_or_default();
+
+    let tmp = std::env::temp_dir().join(format!("poc-memory-edit-{}.md", key.replace('/', "_")));
+    std::fs::write(&tmp, &content)
+        .map_err(|e| format!("write temp file: {}", e))?;
+
+    let editor = std::env::var("EDITOR").unwrap_or_else(|_| "vi".into());
+    let status = std::process::Command::new(&editor)
+        .arg(&tmp)
+        .status()
+        .map_err(|e| format!("spawn {}: {}", editor, e))?;
+
+    if !status.success() {
+        let _ = std::fs::remove_file(&tmp);
+        return Err(format!("{} exited with {}", editor, status));
+    }
+
+    let new_content = std::fs::read_to_string(&tmp)
+        .map_err(|e| format!("read temp file: {}", e))?;
+    let _ = std::fs::remove_file(&tmp);
+
+    if new_content == content {
+        println!("No change: '{}'", key);
+        return Ok(());
+    }
+
+    if new_content.trim().is_empty() {
+        return Err("Content is empty, aborting".into());
+    }
+
+    drop(store);
+    let mut store = store::Store::load()?;
+    let result = store.upsert(&key, &new_content)?;
+    match result {
+        "unchanged" => println!("No change: '{}'", key),
+        "updated" => println!("Updated '{}' (v{})", key, store.nodes[&key].version),
+        _ => println!("Created '{}'", key),
+    }
+    if result != "unchanged" {
+        store.save()?;
+    }
+    Ok(())
+}
+
 pub fn cmd_lookup_bump(keys: &[String]) -> Result<(), String> {
     if keys.is_empty() {
         return Err("lookup-bump requires at least one key".into());
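The `cmd_edit` flow above (write to a temp file, hand it to `$EDITOR`, read back, treat unchanged content as a no-op) can be exercised standalone. A minimal sketch, assuming a Unix environment where the `true` binary serves as a no-op editor that exits 0 without touching the file; `edit_round_trip` is a hypothetical helper, not the poc-memory function:

```rust
use std::io;
use std::process::Command;

// Sketch of the editor round-trip: dump content to a temp file, run the
// editor on it, read it back, and return None when nothing changed.
fn edit_round_trip(content: &str, editor: &str) -> io::Result<Option<String>> {
    let tmp = std::env::temp_dir().join("edit-round-trip-demo.md");
    std::fs::write(&tmp, content)?;
    let status = Command::new(editor).arg(&tmp).status()?;
    if !status.success() {
        let _ = std::fs::remove_file(&tmp);
        return Err(io::Error::new(io::ErrorKind::Other, "editor failed"));
    }
    let new_content = std::fs::read_to_string(&tmp)?;
    let _ = std::fs::remove_file(&tmp);
    // Unchanged content means the caller should skip the write entirely.
    Ok(if new_content == content { None } else { Some(new_content) })
}

fn main() {
    // "true" acts as a no-op editor for the demo (assumption: Unix coreutils).
    let result = edit_round_trip("hello\n", "true").unwrap();
    assert_eq!(result, None); // unchanged content, so the edit is a no-op
    println!("no change");
}
```

Comparing read-back content against the original before touching the store is what makes accidental saves cheap: closing the editor without editing leaves the node untouched.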
@@ -69,6 +69,11 @@ enum Command {
         /// Node key
         key: Vec<String>,
     },
+    /// Edit a node in $EDITOR
+    Edit {
+        /// Node key
+        key: Vec<String>,
+    },
     /// Show all stored versions of a node
     History {
         /// Show full content for every version
@@ -778,6 +783,7 @@ fn main() {
             => cli::misc::cmd_search(&query, &pipeline, expand, full, debug, fuzzy, content),
         Command::Render { key } => cli::node::cmd_render(&key),
         Command::Write { key } => cli::node::cmd_write(&key),
+        Command::Edit { key } => cli::node::cmd_edit(&key),
         Command::History { full, key } => cli::node::cmd_history(&key, full),
         Command::Tail { n, full } => cli::journal::cmd_tail(n, full),
         Command::Status => cli::misc::cmd_status(),