AMoE: Agglomerative Mixture-of-Experts Vision Foundation Model • Paper 2512.20157 • Published Dec 23, 2025
Falcon-H1-Tiny • Collection • A series of extremely small yet powerful language models redefining capabilities at small scale • 22 items • Updated 28 days ago
Learnable Multipliers: Freeing the Scale of Language Model Matrix Layers • Paper 2601.04890 • Published Jan 8
Introducing Falcon-H1-Arabic: Pushing the Boundaries of Arabic Language AI with Hybrid Architecture • Article • Published Jan 5