Sensitivity-Aware Training (SAT): train LLMs to be quantization‑ready with sensitivity‑aware methods
SAKD (SWAN-Guided Knowledge Distillation): generate quantization‑ready student models via guided distillation
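Neither entry spells out how sensitivity is measured, so the details of SAT and SAKD are not reproduced here. As a loose, generic illustration only, a per-layer sensitivity probe for quantization readiness is often built from fake quantization (quantize, then dequantize) and the resulting loss increase; the function names `fake_quantize` and `layer_sensitivity` below are hypothetical, not from either project:

```python
import numpy as np

def fake_quantize(w, bits=4):
    """Uniform symmetric fake quantization: round weights to a
    (2^bits - 1)-level grid, then map back to float."""
    qmax = 2 ** (bits - 1) - 1
    max_abs = np.max(np.abs(w))
    if max_abs == 0:
        return w.copy()
    scale = max_abs / qmax
    return np.round(w / scale) * scale

def layer_sensitivity(w, loss_fn, bits=4):
    """Sensitivity of one layer = loss change when only that layer's
    weights are replaced by their fake-quantized version."""
    return loss_fn(fake_quantize(w, bits)) - loss_fn(w)

# Toy usage: a quadratic "loss" around a target weight vector.
target = np.array([0.52, -0.31, 0.12])
loss = lambda w: float(np.sum((w - target) ** 2))
w = target.copy()
print(layer_sensitivity(w, loss, bits=4))  # small positive value: quantization hurts
```

A sensitivity score like this can then weight a training penalty (layers that degrade more under quantization get pulled harder toward quantization-friendly values), which is one common way to make a model "quantization-ready" before post-training quantization.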