- **TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF** • 30B • Updated 13 days ago • 14.8k • 122
- **Unlocking On-Policy Distillation for Any Model Family** 📝 • 81 • Improve model performance by transferring knowledge between different model families
- **The Smol Training Playbook** 📚 • 2.95k • The secrets to building world-class LLMs
- **POINTS-Reader: Distillation-Free Adaptation of Vision-Language Models for Document Conversion** • Paper 2509.01215 • Published Sep 1, 2025 • 51