---
title: README
emoji: 🔥
colorFrom: pink
colorTo: indigo
sdk: static
pinned: false
---

# Moxin LM: From SOTA Research to Efficient Deployment

- **Open Creation:** The Moxin-7B series is our truly open, SOTA-performing family of LLMs and VLMs. We build, fine-tune, and openly release our own models.

- **Efficient Deployment:** We specialize in extreme quantization, creating resource-efficient variants of popular models (like DeepSeek and Kimi) that can run anywhere.

We unleash the power of reproducible AI 🚀. Explore our models below and on GitHub, and read our research on Moxin-7B (Open Creation) and MoE Compression (Efficient Deployment).
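
As a quick start, here is a minimal sketch of loading a Moxin checkpoint with the standard Hugging Face Transformers API. The repo id `moxin-org/moxin-llm-7b` is an assumption for illustration only; check the model cards on this page for the exact names and for the quantized variants.

```python
# Minimal quick-start sketch. The repo id below is an assumption --
# verify the exact model name on the Moxin organization page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "moxin-org/moxin-llm-7b"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain reproducible AI in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```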