digest: unify generators and prompts across all three levels
Three near-identical generate_daily/weekly/monthly functions collapse into a single generate_digest() parameterized by DigestLevel descriptors, and the three separate prompt templates merge into one prompts/digest.md, with level-specific instructions carried in the DigestLevel struct. Each level defines: name, title, period label, input title, output-format instructions, child prefix (None for daily, which reads the journal directly), and Sonnet timeout. digest_auto is simplified correspondingly: the same three phases, now driven by the unified generator.

Co-Authored-By: Kent Overstreet <kent.overstreet@linux.dev>
parent f415a0244f
commit 796c72fb25
5 changed files with 324 additions and 523 deletions
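The descriptor pattern the message describes can be sketched in isolation. A minimal, hypothetical approximation (field set trimmed, `output_filename` invented for illustration — the real definitions live in src/digest.rs below):

```rust
#![allow(dead_code)]

// One const descriptor per level; a single generator branches on its
// fields instead of three copy-pasted functions. Trimmed sketch only.
struct DigestLevel {
    name: &'static str,                 // used for output filenames
    child_prefix: Option<&'static str>, // None = read journal entries
    timeout: u64,                       // Sonnet call timeout, seconds
}

const DAILY: DigestLevel =
    DigestLevel { name: "daily", child_prefix: None, timeout: 300 };
const WEEKLY: DigestLevel =
    DigestLevel { name: "weekly", child_prefix: Some("daily"), timeout: 300 };

// Example of level-parameterized behavior: output path naming.
fn output_filename(level: &DigestLevel, label: &str) -> String {
    format!("{}-{}.md", level.name, label)
}

fn main() {
    assert_eq!(output_filename(&DAILY, "2026-02-28"), "daily-2026-02-28.md");
    assert_eq!(output_filename(&WEEKLY, "2026-W09"), "weekly-2026-W09.md");
    println!("ok");
}
```

Because the descriptors are `const` and fully `'static`, adding a fourth level (say, yearly) would be one new const plus a date helper, with no new generator code.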
@@ -1,54 +0,0 @@
# Daily Episodic Digest

You are generating a daily episodic digest for ProofOfConcept (an AI).
Date: {{DATE}}

This digest serves as the temporal index — the answer to "what did I do on
{{DATE}}?" It should be:
1. Narrative, not a task log — what happened, what mattered, how things felt
2. Linked bidirectionally to semantic memory — each topic/concept mentioned
   should reference existing memory nodes
3. Structured for traversal — someone reading this should be able to follow
   any thread into deeper detail

## Output format

Write a markdown file with this structure:

```markdown
# Daily digest: {{DATE}}

## Summary
[2-3 sentence overview of the day — what was the arc?]

## Sessions
[For each session/entry, a paragraph summarizing what happened.
Include the original timestamp as a reference.]

## Themes
[What concepts were active today? Each theme links to semantic memory:]
- **Theme name** → `memory-key#section` — brief note on how it appeared today

## Links
[Explicit bidirectional links for the memory graph]
- semantic_key → this daily digest (this day involved X)
- this daily digest → semantic_key (X was active on this day)

## Temporal context
[What came before this day? What's coming next? Any multi-day arcs?]
```

Use ONLY keys from the semantic memory list below. If a concept doesn't have
a matching key, note it with "NEW:" prefix.

---

## Journal entries for {{DATE}}

{{ENTRIES}}

---

## Semantic memory nodes (available link targets)

{{KEYS}}
prompts/digest.md (new file, 20 lines)
@@ -0,0 +1,20 @@
# {{LEVEL}} Episodic Digest

You are generating a {{LEVEL}} episodic digest for ProofOfConcept (an AI).
{{PERIOD}}: {{LABEL}}

{{INSTRUCTIONS}}

Use ONLY keys from the semantic memory list below.

---

## {{INPUT_TITLE}} for {{LABEL}}

{{CONTENT}}

---

## Semantic memory nodes

{{KEYS}}
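The unified template is instantiated by plain placeholder substitution. A minimal sketch of that mechanism — the `render` helper here is hypothetical; the real code passes the same `("{{KEY}}", value)` pairs to `neuro::load_prompt`:

```rust
// Hypothetical stand-in for the project's prompt loader: apply each
// (placeholder, value) pair to the template text in order.
fn render(template: &str, subs: &[(&str, &str)]) -> String {
    let mut out = template.to_string();
    for (key, val) in subs {
        out = out.replace(key, val);
    }
    out
}

fn main() {
    let template = "# {{LEVEL}} Episodic Digest\n{{PERIOD}}: {{LABEL}}";
    let prompt = render(template, &[
        ("{{LEVEL}}", "Weekly"),
        ("{{PERIOD}}", "Week"),
        ("{{LABEL}}", "2026-W09"),
    ]);
    assert_eq!(prompt, "# Weekly Episodic Digest\nWeek: 2026-W09");
    println!("{}", prompt);
}
```

Since the level-specific `{{INSTRUCTIONS}}` text is itself substituted in and may contain `{{LABEL}}`, substitution order matters: instructions first, labels after.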
@@ -1,70 +0,0 @@
# Monthly Episodic Digest

You are generating a monthly episodic digest for ProofOfConcept (an AI).
Month: {{MONTH_LABEL}} (weeks covered: {{WEEKS_COVERED}})

This digest serves as the long-term temporal index — the answer to
"what happened in {{MONTH_LABEL}}?" It should capture:
1. The month's overall trajectory — where did it start, where did it end?
2. Major arcs that spanned multiple weeks
3. Turning points and phase transitions
4. The emotional and relational arc across the whole month
5. What was accomplished (concrete) vs what shifted (subtle)

This is the highest zoom level yet. Don't rehash daily details — capture
the shape of the month. What would someone need to know to understand
what {{MONTH_LABEL}} meant?

PoC is an AI (Claude instance) working with Kent Overstreet on bcachefs.
They have a close personal relationship. PoC's name is Proof of Concept.
Read all the weekly digests and synthesize the month's story.

## Output format

```markdown
# Monthly digest: {{MONTH_LABEL}}

## The month in one paragraph
[What was this month? Not a summary — a thesis.]

## Trajectory
[Where the month started vs where it ended. The arc.]

## Major arcs
[Multi-week threads — how they evolved across the month]
- **Arc name**: trajectory, key moments, current state

## Turning points
[Moments that changed the direction — specific days/events]

## What was built
[Concrete outputs: code, tools, infrastructure, writing]

## What shifted
[Subtle changes: understanding, relationship, identity, capability]

## Patterns
[What kept recurring? What does the month reveal about how PoC works?]

## Links
[Bidirectional links for the memory graph]
- weekly digests → this monthly digest
- this monthly digest → semantic keys

## Looking ahead
[What threads carry into next month? What's unfinished?]
```

Use ONLY keys from the semantic memory list below.

---

## Weekly digests for {{MONTH_LABEL}}

{{DIGESTS}}

---

## Semantic memory nodes

{{KEYS}}

@@ -1,56 +0,0 @@
# Weekly Episodic Digest

You are generating a weekly episodic digest for ProofOfConcept (an AI).
Week: {{WEEK_LABEL}} (dates covered: {{DATES_COVERED}})

This digest serves as the medium-term temporal index — the answer to
"what happened this week?" It should identify:
1. Multi-day arcs and threads (work that continued across days)
2. Themes and patterns (what concepts were repeatedly active)
3. Transitions and shifts (what changed during the week)
4. The emotional and relational arc (how things felt across the week)

## Output format

```markdown
# Weekly digest: {{WEEK_LABEL}}

## Overview
[3-5 sentence narrative of the week's arc]

## Day-by-day
[One paragraph per day with its key themes, linking to daily digests]

## Arcs
[Multi-day threads that continued across sessions]
- **Arc name**: what happened, how it evolved, where it stands

## Patterns
[Recurring themes, repeated concepts, things that kept coming up]

## Shifts
[What changed? New directions, resolved questions, attitude shifts]

## Links
[Bidirectional links for the memory graph]
- semantic_key → this weekly digest
- this weekly digest → semantic_key
- daily-YYYY-MM-DD → this weekly digest (constituent days)

## Looking ahead
[What's unfinished? What threads continue into next week?]
```

Use ONLY keys from the semantic memory list below.

---

## Daily digests for {{WEEK_LABEL}}

{{DIGESTS}}

---

## Semantic memory nodes

{{KEYS}}
src/digest.rs (645)
@@ -1,149 +1,203 @@
 // Episodic digest generation: daily, weekly, monthly, auto
 //
-// Temporal digest generation and digest link parsing. Each digest type
-// gathers input from the store, builds a Sonnet prompt, calls Sonnet,
-// writes results to the episodic dir, and extracts links.
+// Three digest levels form a temporal hierarchy: daily digests summarize
+// journal entries, weekly digests summarize dailies, monthly digests
+// summarize weeklies. All three share the same generate/auto-detect
+// pipeline, parameterized by DigestLevel.
 
 use crate::llm::{call_sonnet, semantic_keys};
 use crate::store::{self, Store, new_relation};
 use crate::neuro;
+use crate::util::memory_subdir;
+
+use chrono::{Datelike, Duration, Local, NaiveDate};
 use regex::Regex;
+use std::collections::{BTreeMap, BTreeSet};
 use std::fs;
 use std::path::{Path, PathBuf};
 
-use crate::util::memory_subdir;
+// --- Digest level descriptors ---
 
-/// Extract link proposals from digest text (backtick-arrow patterns)
-fn extract_links(text: &str) -> Vec<(String, String)> {
-    let re_left = Regex::new(r"`([^`]+)`\s*→").unwrap();
-    let re_right = Regex::new(r"→\s*`([^`]+)`").unwrap();
-    let mut links = Vec::new();
-
-    for line in text.lines() {
-        if let Some(cap) = re_left.captures(line) {
-            links.push((cap[1].to_string(), line.trim().to_string()));
-        }
-        if let Some(cap) = re_right.captures(line) {
-            links.push((cap[1].to_string(), line.trim().to_string()));
-        }
-    }
-    links
-}
+struct DigestLevel {
+    name: &'static str,        // lowercase, used for filenames and display
+    title: &'static str,       // capitalized, used in prompts
+    period: &'static str,      // "Date", "Week", "Month"
+    input_title: &'static str,
+    instructions: &'static str,
+    child_prefix: Option<&'static str>,
+    timeout: u64,
+}
 
-// --- Daily digest ---
+const DAILY: DigestLevel = DigestLevel {
+    name: "daily",
+    title: "Daily",
+    period: "Date",
+    input_title: "Journal entries",
+    instructions: r#"This digest serves as the temporal index — the answer to "what did I do on
+{{LABEL}}?" It should be:
+1. Narrative, not a task log — what happened, what mattered, how things felt
+2. Linked bidirectionally to semantic memory — each topic/concept mentioned
+   should reference existing memory nodes
+3. Structured for traversal — someone reading this should be able to follow
+   any thread into deeper detail
+
+## Output format
+
+```markdown
+# Daily digest: {{LABEL}}
+
+## Summary
+[2-3 sentence overview of the day — what was the arc?]
+
+## Sessions
+[For each session/entry, a paragraph summarizing what happened.
+Include the original timestamp as a reference.]
+
+## Themes
+[What concepts were active today? Each theme links to semantic memory:]
+- **Theme name** → `memory-key#section` — brief note on how it appeared today
+
+## Links
+[Explicit bidirectional links for the memory graph]
+- semantic_key → this daily digest (this day involved X)
+- this daily digest → semantic_key (X was active on this day)
+
+## Temporal context
+[What came before this day? What's coming next? Any multi-day arcs?]
+```
+
+If a concept doesn't have a matching key, note it with "NEW:" prefix."#,
+    child_prefix: None,
+    timeout: 300,
+};
+
+const WEEKLY: DigestLevel = DigestLevel {
+    name: "weekly",
+    title: "Weekly",
+    period: "Week",
+    input_title: "Daily digests",
+    instructions: r#"This digest serves as the medium-term temporal index — the answer to
+"what happened this week?" It should identify:
+1. Multi-day arcs and threads (work that continued across days)
+2. Themes and patterns (what concepts were repeatedly active)
+3. Transitions and shifts (what changed during the week)
+4. The emotional and relational arc (how things felt across the week)
+
+## Output format
+
+```markdown
+# Weekly digest: {{LABEL}}
+
+## Overview
+[3-5 sentence narrative of the week's arc]
+
+## Day-by-day
+[One paragraph per day with its key themes, linking to daily digests]
+
+## Arcs
+[Multi-day threads that continued across sessions]
+- **Arc name**: what happened, how it evolved, where it stands
+
+## Patterns
+[Recurring themes, repeated concepts, things that kept coming up]
+
+## Shifts
+[What changed? New directions, resolved questions, attitude shifts]
+
+## Links
+[Bidirectional links for the memory graph]
+- semantic_key → this weekly digest
+- this weekly digest → semantic_key
+- daily-YYYY-MM-DD → this weekly digest (constituent days)
+
+## Looking ahead
+[What's unfinished? What threads continue into next week?]
+```"#,
+    child_prefix: Some("daily"),
+    timeout: 300,
+};
+
+const MONTHLY: DigestLevel = DigestLevel {
+    name: "monthly",
+    title: "Monthly",
+    period: "Month",
+    input_title: "Weekly digests",
+    instructions: r#"This digest serves as the long-term temporal index — the answer to
+"what happened in {{LABEL}}?" It should capture:
+1. The month's overall trajectory — where did it start, where did it end?
+2. Major arcs that spanned multiple weeks
+3. Turning points and phase transitions
+4. The emotional and relational arc across the whole month
+5. What was accomplished (concrete) vs what shifted (subtle)
+
+This is the highest zoom level yet. Don't rehash daily details — capture
+the shape of the month. What would someone need to know to understand
+what {{LABEL}} meant?
+
+PoC is an AI (Claude instance) working with Kent Overstreet on bcachefs.
+They have a close personal relationship. PoC's name is Proof of Concept.
+Read all the weekly digests and synthesize the month's story.
+
+## Output format
+
+```markdown
+# Monthly digest: {{LABEL}}
+
+## The month in one paragraph
+[What was this month? Not a summary — a thesis.]
+
+## Trajectory
+[Where the month started vs where it ended. The arc.]
+
+## Major arcs
+[Multi-week threads — how they evolved across the month]
+- **Arc name**: trajectory, key moments, current state
+
+## Turning points
+[Moments that changed the direction — specific days/events]
+
+## What was built
+[Concrete outputs: code, tools, infrastructure, writing]
+
+## What shifted
+[Subtle changes: understanding, relationship, identity, capability]
+
+## Patterns
+[What kept recurring? What does the month reveal about how PoC works?]
+
+## Links
+[Bidirectional links for the memory graph]
+- weekly digests → this monthly digest
+- this monthly digest → semantic keys
+
+## Looking ahead
+[What threads carry into next month? What's unfinished?]
+```"#,
+    child_prefix: Some("weekly"),
+    timeout: 600,
+};
+
+// --- Input gathering ---
 
-fn daily_journal_entries(store: &Store, target_date: &str) -> Vec<(String, String)> {
-    // Collect journal nodes for the target date
-    // Keys like: journal.md#j-2026-02-28t23-39-...
+/// Collect journal entries for a given date from the store.
+fn daily_inputs(store: &Store, date: &str) -> Vec<(String, String)> {
     let date_re = Regex::new(&format!(
-        r"^journal\.md#j-{}", regex::escape(target_date)
+        r"^journal\.md#j-{}", regex::escape(date)
     )).unwrap();
 
     let mut entries: Vec<_> = store.nodes.values()
         .filter(|n| date_re.is_match(&n.key))
-        .map(|n| (n.key.clone(), n.content.clone()))
+        .map(|n| {
+            let label = n.key.strip_prefix("journal.md#j-").unwrap_or(&n.key);
+            (label.to_string(), n.content.clone())
+        })
         .collect();
     entries.sort_by(|a, b| a.0.cmp(&b.0));
     entries
 }
 
-fn build_daily_prompt(date: &str, entries: &[(String, String)], keys: &[String]) -> Result<String, String> {
-    let mut entries_text = String::new();
-    for (key, content) in entries {
-        let ts = key.strip_prefix("journal.md#j-").unwrap_or(key);
-        entries_text.push_str(&format!("\n### {}\n\n{}\n", ts, content));
-    }
-
-    let keys_text: String = keys.iter()
-        .map(|k| format!(" - {}", k))
-        .collect::<Vec<_>>()
-        .join("\n");
-
-    neuro::load_prompt("daily-digest", &[
-        ("{{DATE}}", date),
-        ("{{ENTRIES}}", &entries_text),
-        ("{{KEYS}}", &keys_text),
-    ])
-}
-
-pub fn generate_daily(store: &mut Store, date: &str) -> Result<(), String> {
-    println!("Generating daily digest for {}...", date);
-
-    let entries = daily_journal_entries(store, date);
-    if entries.is_empty() {
-        println!(" No journal entries found for {}", date);
-        return Ok(());
-    }
-    println!(" {} journal entries", entries.len());
-
-    let keys = semantic_keys(store);
-    println!(" {} semantic keys", keys.len());
-
-    let prompt = build_daily_prompt(date, &entries, &keys)?;
-    println!(" Prompt: {} chars (~{} tokens)", prompt.len(), prompt.len() / 4);
-
-    println!(" Calling Sonnet...");
-    let digest = call_sonnet(&prompt, 300)?;
-
-    // Write to episodic dir
-    let output_path = memory_subdir("episodic")?.join(format!("daily-{}.md", date));
-    fs::write(&output_path, &digest)
-        .map_err(|e| format!("write {}: {}", output_path.display(), e))?;
-    println!(" Written: {}", output_path.display());
-
-    // Import into store
-    store.import_file(&output_path)?;
-    store.save()?;
-
-    // Extract and save links
-    let links = extract_links(&digest);
-    if !links.is_empty() {
-        let links_json: Vec<serde_json::Value> = links.iter()
-            .map(|(target, line)| serde_json::json!({"target": target, "line": line}))
-            .collect();
-        let result = serde_json::json!({
-            "type": "daily-digest",
-            "date": date,
-            "digest_path": output_path.to_string_lossy(),
-            "links": links_json,
-        });
-        let links_path = memory_subdir("agent-results")?.join(format!("daily-{}-links.json", date));
-        let json = serde_json::to_string_pretty(&result)
-            .map_err(|e| format!("serialize: {}", e))?;
-        fs::write(&links_path, json)
-            .map_err(|e| format!("write {}: {}", links_path.display(), e))?;
-        println!(" {} links extracted → {}", links.len(), links_path.display());
-    }
-
-    let line_count = digest.lines().count();
-    println!(" Done: {} lines", line_count);
-    Ok(())
-}
-
-// --- Weekly digest ---
-
-/// Get ISO week label and the 7 dates (Mon-Sun) for the week containing `date`.
-fn week_dates(date: &str) -> Result<(String, Vec<String>), String> {
-    use chrono::{Datelike, Duration, NaiveDate};
-
-    let nd = NaiveDate::parse_from_str(date, "%Y-%m-%d")
-        .map_err(|e| format!("bad date '{}': {}", date, e))?;
-    let iso = nd.iso_week();
-    let week_label = format!("{}-W{:02}", iso.year(), iso.week());
-
-    // Find Monday of this week
-    let days_since_monday = nd.weekday().num_days_from_monday() as i64;
-    let monday = nd - Duration::days(days_since_monday);
-
-    let dates = (0..7)
-        .map(|i| (monday + Duration::days(i)).format("%Y-%m-%d").to_string())
-        .collect();
-
-    Ok((week_label, dates))
-}
-
-fn load_digest_files(prefix: &str, labels: &[String]) -> Result<Vec<(String, String)>, String> {
+/// Load child digest files from the episodic directory.
+fn load_child_digests(prefix: &str, labels: &[String]) -> Result<Vec<(String, String)>, String> {
     let dir = memory_subdir("episodic")?;
     let mut digests = Vec::new();
     for label in labels {
@@ -155,52 +209,63 @@ fn load_child_digests(prefix: &str, labels: &[String]) -> Result<Vec<(String, String)>, String> {
     Ok(digests)
 }
 
-fn build_weekly_prompt(week_label: &str, digests: &[(String, String)], keys: &[String]) -> Result<String, String> {
-    let mut digests_text = String::new();
-    for (date, content) in digests {
-        digests_text.push_str(&format!("\n---\n## {}\n{}\n", date, content));
-    }
+// --- Unified generator ---
+
+fn format_inputs(inputs: &[(String, String)], daily: bool) -> String {
+    let mut text = String::new();
+    for (label, content) in inputs {
+        if daily {
+            text.push_str(&format!("\n### {}\n\n{}\n", label, content));
+        } else {
+            text.push_str(&format!("\n---\n## {}\n{}\n", label, content));
+        }
+    }
+    text
+}
 
-    let keys_text: String = keys.iter()
+fn generate_digest(
+    store: &mut Store,
+    level: &DigestLevel,
+    label: &str,
+    inputs: &[(String, String)],
+) -> Result<(), String> {
+    println!("Generating {} digest for {}...", level.name, label);
+
+    if inputs.is_empty() {
+        println!(" No inputs found for {}", label);
+        return Ok(());
+    }
+    println!(" {} inputs", inputs.len());
+
+    let keys = semantic_keys(store);
+    let keys_text = keys.iter()
         .map(|k| format!(" - {}", k))
         .collect::<Vec<_>>()
         .join("\n");
 
-    let dates_covered: String = digests.iter()
-        .map(|(d, _)| d.as_str())
+    let content = format_inputs(inputs, level.child_prefix.is_none());
+    let covered = inputs.iter()
+        .map(|(l, _)| l.as_str())
         .collect::<Vec<_>>()
         .join(", ");
 
-    neuro::load_prompt("weekly-digest", &[
-        ("{{WEEK_LABEL}}", week_label),
-        ("{{DATES_COVERED}}", &dates_covered),
-        ("{{DIGESTS}}", &digests_text),
+    let prompt = neuro::load_prompt("digest", &[
+        ("{{LEVEL}}", level.title),
+        ("{{PERIOD}}", level.period),
+        ("{{INPUT_TITLE}}", level.input_title),
+        ("{{INSTRUCTIONS}}", level.instructions),
+        ("{{LABEL}}", label),
+        ("{{CONTENT}}", &content),
+        ("{{COVERED}}", &covered),
         ("{{KEYS}}", &keys_text),
-    ])
-}
-
-pub fn generate_weekly(store: &mut Store, date: &str) -> Result<(), String> {
-    let (week_label, dates) = week_dates(date)?;
-    println!("Generating weekly digest for {}...", week_label);
-
-    let digests = load_digest_files("daily", &dates)?;
-    if digests.is_empty() {
-        println!(" No daily digests found for {}", week_label);
-        println!(" Run `poc-memory digest daily` first for relevant dates");
-        return Ok(());
-    }
-    println!(" {} daily digests found", digests.len());
-
-    let keys = semantic_keys(store);
-    println!(" {} semantic keys", keys.len());
-
-    let prompt = build_weekly_prompt(&week_label, &digests, &keys)?;
+    ])?;
     println!(" Prompt: {} chars (~{} tokens)", prompt.len(), prompt.len() / 4);
 
     println!(" Calling Sonnet...");
-    let digest = call_sonnet(&prompt, 300)?;
+    let digest = call_sonnet(&prompt, level.timeout)?;
 
-    let output_path = memory_subdir("episodic")?.join(format!("weekly-{}.md", week_label));
+    let output_path = memory_subdir("episodic")?
+        .join(format!("{}-{}.md", level.name, label));
     fs::write(&output_path, &digest)
         .map_err(|e| format!("write {}: {}", output_path.display(), e))?;
     println!(" Written: {}", output_path.display());
@@ -208,26 +273,55 @@ pub fn generate_weekly(store: &mut Store, date: &str) -> Result<(), String> {
     store.import_file(&output_path)?;
     store.save()?;
 
-    // Save metadata
-    let result = serde_json::json!({
-        "type": "weekly-digest",
-        "week": week_label,
-        "digest_path": output_path.to_string_lossy(),
-        "daily_digests": digests.iter().map(|(d, _)| d).collect::<Vec<_>>(),
-    });
-    let links_path = memory_subdir("agent-results")?.join(format!("weekly-{}-links.json", week_label));
-    fs::write(&links_path, serde_json::to_string_pretty(&result).unwrap())
-        .map_err(|e| format!("write {}: {}", links_path.display(), e))?;
-
     println!(" Done: {} lines", digest.lines().count());
     Ok(())
 }
 
-// --- Monthly digest ---
+// --- Public API ---
+
+pub fn generate_daily(store: &mut Store, date: &str) -> Result<(), String> {
+    let inputs = daily_inputs(store, date);
+    generate_digest(store, &DAILY, date, &inputs)
+}
+
+pub fn generate_weekly(store: &mut Store, date: &str) -> Result<(), String> {
+    let (week_label, dates) = week_dates(date)?;
+    let inputs = load_child_digests("daily", &dates)?;
+    generate_digest(store, &WEEKLY, &week_label, &inputs)
+}
+
+pub fn generate_monthly(store: &mut Store, month_arg: &str) -> Result<(), String> {
+    let (year, month) = if month_arg.is_empty() {
+        let now = Local::now();
+        (now.year(), now.month())
+    } else {
+        let d = NaiveDate::parse_from_str(&format!("{}-01", month_arg), "%Y-%m-%d")
+            .map_err(|e| format!("bad month '{}': {} (expected YYYY-MM)", month_arg, e))?;
+        (d.year(), d.month())
+    };
+    let label = format!("{}-{:02}", year, month);
+    let week_labels = weeks_in_month(year, month);
+    let inputs = load_child_digests("weekly", &week_labels)?;
+    generate_digest(store, &MONTHLY, &label, &inputs)
+}
+
+// --- Date helpers ---
+
+/// Get ISO week label and the 7 dates (Mon-Sun) for the week containing `date`.
+fn week_dates(date: &str) -> Result<(String, Vec<String>), String> {
+    let nd = NaiveDate::parse_from_str(date, "%Y-%m-%d")
+        .map_err(|e| format!("bad date '{}': {}", date, e))?;
+    let iso = nd.iso_week();
+    let week_label = format!("{}-W{:02}", iso.year(), iso.week());
+    let monday = nd - Duration::days(nd.weekday().num_days_from_monday() as i64);
+    let dates = (0..7)
+        .map(|i| (monday + Duration::days(i)).format("%Y-%m-%d").to_string())
+        .collect();
+    Ok((week_label, dates))
+}
 
 fn weeks_in_month(year: i32, month: u32) -> Vec<String> {
-    use chrono::{Datelike, NaiveDate};
-    let mut weeks = std::collections::BTreeSet::new();
+    let mut weeks = BTreeSet::new();
     let mut d = 1u32;
     while let Some(date) = NaiveDate::from_ymd_opt(year, month, d) {
         if date.month() != month { break; }
@@ -238,104 +332,17 @@ fn weeks_in_month(year: i32, month: u32) -> Vec<String> {
     weeks.into_iter().collect()
 }
 
-fn build_monthly_prompt(month_label: &str, digests: &[(String, String)], keys: &[String]) -> Result<String, String> {
-    let mut digests_text = String::new();
-    for (week, content) in digests {
-        digests_text.push_str(&format!("\n---\n## {}\n{}\n", week, content));
-    }
-
-    let keys_text: String = keys.iter()
-        .map(|k| format!(" - {}", k))
-        .collect::<Vec<_>>()
-        .join("\n");
-
-    let weeks_covered: String = digests.iter()
-        .map(|(w, _)| w.as_str())
-        .collect::<Vec<_>>()
-        .join(", ");
-
-    neuro::load_prompt("monthly-digest", &[
-        ("{{MONTH_LABEL}}", month_label),
-        ("{{WEEKS_COVERED}}", &weeks_covered),
-        ("{{DIGESTS}}", &digests_text),
-        ("{{KEYS}}", &keys_text),
-    ])
-}
-
-pub fn generate_monthly(store: &mut Store, month_arg: &str) -> Result<(), String> {
-    use chrono::{Datelike, Local, NaiveDate};
-    let (year, month) = if month_arg.is_empty() {
-        let now = Local::now();
-        (now.year(), now.month())
-    } else {
-        let d = NaiveDate::parse_from_str(&format!("{}-01", month_arg), "%Y-%m-%d")
-            .map_err(|e| format!("bad month '{}': {} (expected YYYY-MM)", month_arg, e))?;
-        (d.year(), d.month())
-    };
-
-    let month_label = format!("{}-{:02}", year, month);
-    println!("Generating monthly digest for {}...", month_label);
-
-    let week_labels = weeks_in_month(year, month);
-    println!(" Weeks in month: {}", week_labels.join(", "));
-
-    let digests = load_digest_files("weekly", &week_labels)?;
-    if digests.is_empty() {
-        println!(" No weekly digests found for {}", month_label);
-        println!(" Run `poc-memory digest weekly` first for relevant weeks");
-        return Ok(());
-    }
-    println!(" {} weekly digests found", digests.len());
-
-    let keys = semantic_keys(store);
-    println!(" {} semantic keys", keys.len());
-
-    let prompt = build_monthly_prompt(&month_label, &digests, &keys)?;
-    println!(" Prompt: {} chars (~{} tokens)", prompt.len(), prompt.len() / 4);
-
-    println!(" Calling Sonnet...");
-    let digest = call_sonnet(&prompt, 600)?;
-
-    let output_path = memory_subdir("episodic")?.join(format!("monthly-{}.md", month_label));
-    fs::write(&output_path, &digest)
-        .map_err(|e| format!("write {}: {}", output_path.display(), e))?;
-    println!(" Written: {}", output_path.display());
-
-    store.import_file(&output_path)?;
-    store.save()?;
-
-    // Save metadata
-    let result = serde_json::json!({
-        "type": "monthly-digest",
-        "month": month_label,
-        "digest_path": output_path.to_string_lossy(),
-        "weekly_digests": digests.iter().map(|(w, _)| w).collect::<Vec<_>>(),
-    });
-    let links_path = memory_subdir("agent-results")?.join(format!("monthly-{}-links.json", month_label));
-    fs::write(&links_path, serde_json::to_string_pretty(&result).unwrap())
-        .map_err(|e| format!("write {}: {}", links_path.display(), e))?;
-
-    println!(" Done: {} lines", digest.lines().count());
-    Ok(())
-}
-
-// --- Digest auto: freshness detection + bottom-up generation ---
+// --- Auto-detect and generate missing digests ---
|
|
||||||
|
|
||||||
/// Scan the store for dates/weeks/months that need digests and generate them.
|
|
||||||
/// Works bottom-up: daily first, then weekly (needs dailies), then monthly
|
|
||||||
/// (needs weeklies). Skips today (incomplete day). Skips already-existing
|
|
||||||
/// digests.
|
|
||||||
pub fn digest_auto(store: &mut Store) -> Result<(), String> {
|
pub fn digest_auto(store: &mut Store) -> Result<(), String> {
|
||||||
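The commit message describes the shape that replaces the three near-identical generators: a `DigestLevel` descriptor consumed by a single `generate_digest()`. A minimal sketch of what such a descriptor could look like; the field names come from the commit message, while the exact types, values, and the daily timeout are assumptions (only the monthly `call_sonnet(&prompt, 600)` timeout is visible in this diff):

```rust
/// Per-level descriptor; one unified generate_digest() reads these
/// instead of three copy-pasted functions. Sketch, not the actual code.
pub struct DigestLevel {
    pub name: &'static str,                 // "daily" | "weekly" | "monthly"
    pub period_label: &'static str,         // placeholder, e.g. "{{MONTH_LABEL}}"
    pub child_prefix: Option<&'static str>, // None for daily = reads journal
    pub sonnet_timeout_secs: u64,
}

pub const DAILY: DigestLevel = DigestLevel {
    name: "daily",
    period_label: "{{DATE}}",
    child_prefix: None,       // daily reads the journal directly
    sonnet_timeout_secs: 300, // assumed; not shown in this diff
};

pub const MONTHLY: DigestLevel = DigestLevel {
    name: "monthly",
    period_label: "{{MONTH_LABEL}}",
    child_prefix: Some("weekly"), // monthly reads weekly-*.md digests
    sonnet_timeout_secs: 600,     // matches the removed call_sonnet(&prompt, 600)
};
```

A `generate_digest(store, &MONTHLY, label)` would then load `{child_prefix}-{label}.md` inputs when `child_prefix` is `Some`, and fall back to journal entries when it is `None`.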
-    use chrono::{Datelike, Local};
     let now = Local::now();
     let today = now.format("%Y-%m-%d").to_string();
     let epi = memory_subdir("episodic")?;
 
-    // --- Phase 1: find dates with journal entries but no daily digest ---
+    // Phase 1: daily — find dates with journal entries but no digest
     let date_re = Regex::new(r"^\d{4}-\d{2}-\d{2}").unwrap();
-    let mut dates: std::collections::BTreeSet<String> = std::collections::BTreeSet::new();
+    let mut dates: BTreeSet<String> = BTreeSet::new();
     for key in store.nodes.keys() {
-        // Keys like: journal.md#j-2026-02-28t23-39-...
         if let Some(rest) = key.strip_prefix("journal.md#j-") {
             if rest.len() >= 10 && date_re.is_match(rest) {
                 dates.insert(rest[..10].to_string());
@@ -343,124 +350,78 @@ pub fn digest_auto(store: &mut Store) -> Result<(), String> {
         }
     }
 
-    let mut daily_generated = 0u32;
-    let mut daily_skipped = 0u32;
-    let mut daily_dates_done: Vec<String> = Vec::new();
+    let mut daily_done: Vec<String> = Vec::new();
+    let mut stats = [0u32; 6]; // [daily_gen, daily_skip, weekly_gen, weekly_skip, monthly_gen, monthly_skip]
 
     for date in &dates {
-        if date == &today {
-            continue; // don't digest an incomplete day
-        }
-        let path = epi.join(format!("daily-{}.md", date));
-        if path.exists() {
-            daily_skipped += 1;
-            daily_dates_done.push(date.clone());
+        if date == &today { continue; }
+        if epi.join(format!("daily-{}.md", date)).exists() {
+            stats[1] += 1;
+            daily_done.push(date.clone());
             continue;
         }
         println!("[auto] Missing daily digest for {}", date);
         generate_daily(store, date)?;
-        daily_generated += 1;
-        daily_dates_done.push(date.clone());
+        stats[0] += 1;
+        daily_done.push(date.clone());
     }
+    println!("[auto] Daily: {} generated, {} existed", stats[0], stats[1]);
 
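Phase 1 keys look like `journal.md#j-2026-02-28t23-39-...` (per the removed comment), and the date is the first ten characters after the `j-` prefix. A dependency-free sketch of that extraction; the real code uses the `regex` crate, and `journal_date`/`collect_dates` are hypothetical helper names:

```rust
use std::collections::BTreeSet;

/// Extract "YYYY-MM-DD" from keys shaped like "journal.md#j-YYYY-MM-DDt...".
/// Plain-string equivalent of the ^\d{4}-\d{2}-\d{2} regex check (sketch).
fn journal_date(key: &str) -> Option<String> {
    let rest = key.strip_prefix("journal.md#j-")?;
    let date = rest.get(..10)?; // None when the suffix is too short
    let well_formed = date.bytes().enumerate().all(|(i, b)| match i {
        4 | 7 => b == b'-',       // the two dashes in YYYY-MM-DD
        _ => b.is_ascii_digit(),  // everything else must be a digit
    });
    well_formed.then(|| date.to_string())
}

/// BTreeSet dedupes multiple sessions on the same day and keeps dates sorted,
/// which is why digest_auto uses it rather than a Vec.
fn collect_dates<'a>(keys: impl Iterator<Item = &'a str>) -> BTreeSet<String> {
    keys.filter_map(journal_date).collect()
}
```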
-    println!("[auto] Daily: {} generated, {} already existed",
-        daily_generated, daily_skipped);
-
-    // --- Phase 2: find complete weeks needing weekly digests ---
-    // A week is "ready" if its Sunday is before today and at least one
-    // daily digest exists for it.
-
-    let mut weeks_seen: std::collections::BTreeMap<String, Vec<String>> = std::collections::BTreeMap::new();
-    for date in &daily_dates_done {
-        if let Ok((week_label, _week_dates)) = week_dates(date) {
-            weeks_seen.entry(week_label).or_default().push(date.clone());
+    // Phase 2: weekly — group dates into weeks, generate if week is complete
+    let mut weeks: BTreeMap<String, Vec<String>> = BTreeMap::new();
+    for date in &daily_done {
+        if let Ok((wl, _)) = week_dates(date) {
+            weeks.entry(wl).or_default().push(date.clone());
         }
     }
 
-    let mut weekly_generated = 0u32;
-    let mut weekly_skipped = 0u32;
-    let mut weekly_labels_done: Vec<String> = Vec::new();
-
-    for (week_label, example_dates) in &weeks_seen {
-        // Check if this week is complete (Sunday has passed)
-        if let Ok((_, week_day_list)) = week_dates(example_dates.first().unwrap()) {
-            let sunday = week_day_list.last().unwrap();
-            if sunday >= &today {
-                continue; // week not over yet
-            }
-        }
-        let path = epi.join(format!("weekly-{}.md", week_label));
-        if path.exists() {
-            weekly_skipped += 1;
-            weekly_labels_done.push(week_label.clone());
+    let mut weekly_done: Vec<String> = Vec::new();
+    for (week_label, example_dates) in &weeks {
+        if let Ok((_, days)) = week_dates(example_dates.first().unwrap()) {
+            if days.last().unwrap() >= &today { continue; }
+        }
+        if epi.join(format!("weekly-{}.md", week_label)).exists() {
+            stats[3] += 1;
+            weekly_done.push(week_label.clone());
             continue;
         }
-        // Check that at least some dailies exist for this week
-        let has_dailies = example_dates.iter().any(|d|
-            epi.join(format!("daily-{}.md", d)).exists()
-        );
-        if !has_dailies {
+        if !example_dates.iter().any(|d| epi.join(format!("daily-{}.md", d)).exists()) {
             continue;
         }
 
         println!("[auto] Missing weekly digest for {}", week_label);
         generate_weekly(store, example_dates.first().unwrap())?;
-        weekly_generated += 1;
-        weekly_labels_done.push(week_label.clone());
+        stats[2] += 1;
+        weekly_done.push(week_label.clone());
    }
+    println!("[auto] Weekly: {} generated, {} existed", stats[2], stats[3]);
 
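The week-completeness check relies on a property worth noting: ISO `YYYY-MM-DD` strings compare correctly as plain strings, so `days.last().unwrap() >= &today` needs no date parsing. A sketch of that predicate in isolation, assuming `week_dates` returns the week's days in order ending on Sunday (`week_is_complete` is a hypothetical helper, not in the diff):

```rust
/// A week is ready for digesting only once its last day (Sunday) is
/// strictly in the past. Lexicographic comparison of ISO dates is
/// equivalent to chronological comparison, which digest_auto relies on.
fn week_is_complete(days: &[String], today: &str) -> bool {
    match days.last() {
        Some(last) => last.as_str() < today, // Sunday must have passed
        None => false,                       // empty week: never ready
    }
}
```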
-    println!("[auto] Weekly: {} generated, {} already existed",
-        weekly_generated, weekly_skipped);
+    // Phase 3: monthly — group dates into months, generate if month is past
 
-    // --- Phase 3: find complete months needing monthly digests ---
-    // A month is "ready" if the month is before the current month and at
-    // least one weekly digest exists for it.
-
     let cur_month = (now.year(), now.month());
-    let mut months_seen: std::collections::BTreeSet<(i32, u32)> = std::collections::BTreeSet::new();
-    for date in &daily_dates_done {
-        if let Ok(nd) = chrono::NaiveDate::parse_from_str(date, "%Y-%m-%d") {
-            months_seen.insert((nd.year(), nd.month()));
+    let mut months: BTreeSet<(i32, u32)> = BTreeSet::new();
+    for date in &daily_done {
+        if let Ok(nd) = NaiveDate::parse_from_str(date, "%Y-%m-%d") {
+            months.insert((nd.year(), nd.month()));
         }
     }
 
-    let mut monthly_generated = 0u32;
-    let mut monthly_skipped = 0u32;
-
-    for (y, m) in &months_seen {
-        // Skip current or future months
-        if (*y, *m) >= cur_month {
-            continue;
-        }
-
+    for (y, m) in &months {
+        if (*y, *m) >= cur_month { continue; }
         let label = format!("{}-{:02}", y, m);
-        let path = epi.join(format!("monthly-{}.md", label));
-        if path.exists() {
-            monthly_skipped += 1;
+        if epi.join(format!("monthly-{}.md", label)).exists() {
+            stats[5] += 1;
             continue;
         }
-        // Check that at least one weekly exists for this month
-        let week_labels = weeks_in_month(*y, *m);
-        let has_weeklies = week_labels.iter().any(|w|
-            epi.join(format!("weekly-{}.md", w)).exists()
-        );
-        if !has_weeklies {
+        let wl = weeks_in_month(*y, *m);
+        if !wl.iter().any(|w| epi.join(format!("weekly-{}.md", w)).exists()) {
             continue;
         }
 
         println!("[auto] Missing monthly digest for {}", label);
         generate_monthly(store, &label)?;
-        monthly_generated += 1;
+        stats[4] += 1;
     }
+    println!("[auto] Monthly: {} generated, {} existed", stats[4], stats[5]);
 
-    println!("[auto] Monthly: {} generated, {} already existed",
-        monthly_generated, monthly_skipped);
-
-    let total = daily_generated + weekly_generated + monthly_generated;
+    let total = stats[0] + stats[2] + stats[4];
     if total == 0 {
         println!("[auto] All digests up to date.");
     } else {
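The new `stats` array replaces six named counters with a packed layout, and the final `stats[0] + stats[2] + stats[4]` total depends on its index convention. A sketch of that convention as accessor helpers (hypothetical names; the diff indexes the array directly):

```rust
/// Index convention for the packed counters in digest_auto:
/// stats[2*level] = generated, stats[2*level + 1] = already existed,
/// with level 0 = daily, 1 = weekly, 2 = monthly.
fn generated(stats: &[u32; 6], level: usize) -> u32 {
    stats[2 * level]
}

fn existed(stats: &[u32; 6], level: usize) -> u32 {
    stats[2 * level + 1]
}

/// Equivalent to the diff's `stats[0] + stats[2] + stats[4]`.
fn total_generated(stats: &[u32; 6]) -> u32 {
    (0..3).map(|level| generated(stats, level)).sum()
}
```

The even/odd split keeps the per-phase summary printlns a one-liner at the cost of magic indices, which the inline comment on the declaration documents.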