forked from kent/consciousness
config: global writable AppConfig; learn settings live there
Runtime-mutable settings (F6's threshold knob, the generate-alternates
toggle, anything else that comes along) were ending up as mirrored fields
on MindState — each new config setting grew MindState::new's signature and
added a clone+sync path. Wrong home: MindState is ephemeral session state,
not a config projection.

Give AppConfig the same treatment the memory Config has: install it into
a global RwLock<AppConfig> at startup via load_app, read through
config::app() (returns a read guard), mutate through update_app. The
config_writer functions now write to disk AND update the cache
atomically, so the one-stop-shop call keeps both in sync.

Also while in here:

- learn.generate_alternates moves from a sentinel file
  (~/.consciousness/cache/finetune-alternates, "exists = enabled") into
  the config under the learn section. On first run with this build, if
  the sentinel file still exists, Mind::new flips the config value to
  true and removes it. Drops alternates_enabled()/set_alternates().

- Default threshold 0.0000001 → 1.0. With the timestamp filter removed,
  the previous value was letting essentially everything through; 1.0 is
  a sane "nothing gets through unless you actually want it" default.

- score_finetune_candidates takes generate_alternates as a parameter
  instead of reading a global — the caller snapshots the config values
  once at the top of start_finetune_scoring so the async task doesn't
  need to hold the config read lock across awaits.

- MindState.learn_threshold / learn_generate_alternates are gone; the
  SetLearn* command handlers now just delegate to config_writer.

Kent noted RwLock<Arc<AppConfig>> (the pattern used by the memory Config
global) is pointless here — nobody needs a snapshot-after-release, and
reads are short — so this uses a plain RwLock<AppConfig> and returns a
read guard.

Co-Authored-By: Proof of Concept <poc@bcachefs.org>
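The load_app / config::app() / update_app shape described above can be sketched roughly as follows. This is a minimal standalone sketch, not the repo's code: the field names on AppConfig and the use of OnceLock for one-time installation are assumptions (the commit only names the three functions and the plain RwLock<AppConfig> choice), and the disk write that config_writer pairs with update_app is omitted.

```rust
use std::sync::{OnceLock, RwLock, RwLockReadGuard};

// Field names are assumptions for illustration.
#[derive(Clone, Default)]
pub struct AppConfig {
    pub learn_threshold: f64,
    pub learn_generate_alternates: bool,
}

static APP: OnceLock<RwLock<AppConfig>> = OnceLock::new();

/// Install the loaded config into the global at startup.
pub fn load_app(cfg: AppConfig) {
    let _ = APP.set(RwLock::new(cfg));
}

/// Read access: returns a short-lived read guard, per Kent's note that
/// nobody needs a snapshot-after-release here.
pub fn app() -> RwLockReadGuard<'static, AppConfig> {
    APP.get().expect("load_app not called").read().unwrap()
}

/// Mutate the cached config in place (the real config_writer would also
/// persist to disk so cache and file stay in sync).
pub fn update_app(f: impl FnOnce(&mut AppConfig)) {
    let mut guard = APP.get().expect("load_app not called").write().unwrap();
    f(&mut guard);
}
```

Note the read guard returned by app() must be dropped before calling update_app on the same thread, or the write lock will block forever.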
parent 343e43afab
commit 313f85f34a
5 changed files with 102 additions and 58 deletions
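The one-shot sentinel migration in Mind::new can be sketched as below. This is a hedged stand-in: the real code reads ~/.consciousness/cache/finetune-alternates and flips learn.generate_alternates via the config writer; here the path and setter are passed in so the shape is testable in isolation, and migrate_sentinel is a hypothetical name.

```rust
use std::path::Path;

/// Sketch of the one-shot migration (function name and setter shape are
/// assumptions): if the old sentinel file still exists, flip the config
/// flag on and delete the file, so a second run is a no-op.
pub fn migrate_sentinel(
    sentinel: &Path,
    set_flag: &mut dyn FnMut(bool), // stand-in for the config_writer call
) -> std::io::Result<()> {
    if sentinel.exists() {
        set_flag(true);
        std::fs::remove_file(sentinel)?;
    }
    Ok(())
}
```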
@@ -504,6 +504,7 @@ pub async fn score_finetune_candidates(
     count: usize,
     client: &ApiClient,
     min_divergence: f64,
+    generate_alternates: bool,
     activity: &crate::agent::ActivityGuard,
     mut on_candidate: impl FnMut(FinetuneCandidate),
 ) -> anyhow::Result<(usize, f64)> {
@@ -558,7 +559,7 @@ pub async fn score_finetune_candidates(
     }
 
     let total = candidates.len();
-    let gen_alternates = alternates_enabled() && total > 0;
+    let gen_alternates = generate_alternates && total > 0;
 
     for (i, mut candidate) in candidates.into_iter().enumerate() {
         if gen_alternates {
@@ -616,35 +617,12 @@ async fn generate_alternate(
 use std::path::PathBuf;
 use std::collections::HashSet;
 
-const FINETUNE_ALTERNATES_FILE: &str = ".consciousness/cache/finetune-alternates";
 const TRAINED_RESPONSES_FILE: &str = ".consciousness/cache/trained-responses.json";
 
-fn alternates_path() -> PathBuf {
-    dirs::home_dir().unwrap_or_default().join(FINETUNE_ALTERNATES_FILE)
-}
-
 fn trained_path() -> PathBuf {
     dirs::home_dir().unwrap_or_default().join(TRAINED_RESPONSES_FILE)
 }
 
-/// Check if alternate response generation is enabled.
-pub fn alternates_enabled() -> bool {
-    alternates_path().exists()
-}
-
-/// Toggle alternate response generation and persist the setting.
-pub fn set_alternates(enabled: bool) {
-    let path = alternates_path();
-    if enabled {
-        if let Some(parent) = path.parent() {
-            let _ = std::fs::create_dir_all(parent);
-        }
-        let _ = std::fs::write(&path, "");
-    } else {
-        let _ = std::fs::remove_file(&path);
-    }
-}
-
 /// Load set of trained response timestamps (nanos since epoch).
 pub fn load_trained() -> HashSet<i64> {
     let path = trained_path();
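The snapshot-before-spawn pattern that motivates the new generate_alternates parameter can be sketched like this. It is a self-contained illustration, not the repo's start_finetune_scoring: the mini AppConfig and its field names are assumptions, and the spawned task is elided. The point is that the read guard lives only for one expression, so nothing lock-shaped crosses an .await.

```rust
use std::sync::RwLock;

// Stand-in for the global config (field names assumed for illustration).
pub struct AppConfig {
    pub learn_threshold: f64,
    pub learn_generate_alternates: bool,
}

pub static APP: RwLock<AppConfig> = RwLock::new(AppConfig {
    learn_threshold: 1.0,
    learn_generate_alternates: false,
});

/// Snapshot the learn settings once, as at the top of a function like
/// start_finetune_scoring: the read guard is scoped to this function, so
/// the async task receives plain Copy values and never holds the lock
/// across an await point.
pub fn snapshot_learn_settings() -> (f64, bool) {
    let cfg = APP.read().unwrap();
    (cfg.learn_threshold, cfg.learn_generate_alternates)
    // guard dropped here, before any task is spawned
}
```

A caller would then move the returned values into the spawned future, e.g. passing them as the min_divergence and generate_alternates arguments of score_finetune_candidates, instead of re-reading the global inside the task.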