consciousness/training/apollo_plugin
Kent Overstreet a73bcf5ae3 training: restructure as vLLM plugin package
- Convert to installable package with entry points for vLLM auto-discovery
- Add checkpoint_sync.py: Python replacement for Rust checkpoint binary
  - Block-level diffing of safetensors files (4KB blocks)
  - vLLM→HF weight name conversion built-in
  - Scheduled 10min after training jobs (batched)
- API change: /train now takes raw token IDs (context_ids + continuation_ids)
  - No tokenizer on the training side; the client owns tokenization
- Remove superseded code: standalone scripts, Rust binary, tokenizer helpers
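
The block-level diffing in checkpoint_sync.py can be sketched as follows. This is a minimal illustration of the idea (only changed 4KB blocks need to be rewritten when syncing), not the actual implementation; the function name and signature are hypothetical.

```python
BLOCK_SIZE = 4096  # 4KB blocks, matching the checkpoint_sync granularity

def diff_blocks(old: bytes, new: bytes, block_size: int = BLOCK_SIZE):
    """Return indices of fixed-size blocks that differ between two buffers.

    Hypothetical sketch: compare the old and new safetensors contents
    block by block; only the returned blocks need to be synced.
    """
    n_blocks = (max(len(old), len(new)) + block_size - 1) // block_size
    changed = []
    for i in range(n_blocks):
        lo, hi = i * block_size, (i + 1) * block_size
        if old[lo:hi] != new[lo:hi]:
            changed.append(i)
    return changed
```

A length change shows up as differing (possibly empty-vs-nonempty) trailing blocks, so appended or truncated data is picked up without special-casing.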
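
The /train API change above can be illustrated with a client-side sketch. The field names `context_ids` and `continuation_ids` come from the commit message; the JSON body shape and helper name are assumptions.

```python
import json

def build_train_request(context_ids, continuation_ids):
    """Build a /train request body from pre-tokenized token IDs.

    Hypothetical sketch: tokenization now happens on the client, so the
    training side receives raw integer token IDs only. Field names are
    from the commit message; the overall JSON shape is assumed.
    """
    if not all(isinstance(t, int) for t in [*context_ids, *continuation_ids]):
        raise TypeError("token IDs must be integers")
    return json.dumps({
        "context_ids": list(context_ids),
        "continuation_ids": list(continuation_ids),
    })
```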

Install: pip install -e ./training
vLLM then auto-discovers and loads the plugin via its entry point.
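
A minimal sketch of what that entry-point declaration might look like, assuming the package registers under vLLM's general plugin group; the package and callable names here are hypothetical, and the actual declaration in this repo is not shown:

```toml
[project.entry-points."vllm.general_plugins"]
apollo_plugin = "apollo_plugin:register"
```

With this in pyproject.toml, `pip install -e ./training` makes the entry point visible, and vLLM invokes the registered callable at startup without any explicit import on the serving side.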

Co-Authored-By: Proof of Concept <poc@bcachefs.org>
2026-04-15 23:16:53 -04:00
__init__.py
checkpoint_sync.py
export_hook.py
optimizer.py
steering.py
weight_mapping.py
worker.py