consciousness/training
Kent Overstreet 2f08149fab /finetune: expose all Apollo optimizer settings
lr, rank, betas, eps, weight_decay, warmup_steps,
scale, proj_refresh, norm_growth_limit — all optional
with sensible defaults.

Co-Authored-By: Proof of Concept <poc@bcachefs.org>
2026-04-15 23:19:22 -04:00
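The settings named in the commit message suggest a config surface like the following. This is a hypothetical sketch only: field names mirror the commit text, but the defaults and the `ApolloSettings.from_request` helper are illustrative assumptions, not the plugin's actual code.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical sketch of how /finetune might expose every Apollo
# optimizer knob as an optional field with a sensible default.
# Defaults below are illustrative, not the plugin's real values.
@dataclass
class ApolloSettings:
    lr: float = 1e-4                  # learning rate
    rank: int = 256                   # low-rank projection rank
    betas: Tuple[float, float] = (0.9, 0.999)
    eps: float = 1e-8
    weight_decay: float = 0.0
    warmup_steps: int = 100
    scale: float = 1.0                # gradient scaling factor
    proj_refresh: int = 200           # steps between projection refreshes
    norm_growth_limit: float = 1.1    # cap on per-step gradient-norm growth

    @classmethod
    def from_request(cls, params: dict) -> "ApolloSettings":
        """Build settings from a request body, keeping defaults for any
        field the caller omitted and rejecting unknown keys."""
        unknown = set(params) - set(cls.__dataclass_fields__)
        if unknown:
            raise ValueError(f"unknown Apollo settings: {sorted(unknown)}")
        return cls(**params)
```

A caller passing only `{"lr": 3e-4}` would then get that learning rate plus defaults for everything else, matching the "all optional with sensible defaults" behavior the commit describes.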
apollo_plugin /finetune: expose all Apollo optimizer settings 2026-04-15 23:19:22 -04:00
research research: latent reasoning integration plans for Qwen 3.5 27B 2026-04-12 15:50:09 -04:00
DESIGN.md DESIGN.md: complete rewrite reflecting validated architecture 2026-03-31 00:42:53 -04:00
pyproject.toml training: restructure as vLLM plugin package 2026-04-15 23:16:53 -04:00