
Mistral-7B – fraQtl Compressed

14.48 GB → 9.84 GB. Near-zero quality loss.

| Metric | Value |
|---|---|
| Source model | mistralai/Mistral-7B-Instruct-v0.2 |
| Original size | 14.48 GB |
| Compressed size | 9.84 GB |
| PPL delta (wikitext-2) | +0.35 |
| NIAH retrieval | 3/3 preserved |
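A quick sanity check of the headline numbers above: the compressed weights come to roughly 68% of the original size, i.e. about a 32% reduction.

```python
# Compute the compression ratio from the sizes reported in the table.
original_gb = 14.48
compressed_gb = 9.84

ratio = compressed_gb / original_gb   # fraction of original size retained
saved_pct = (1 - ratio) * 100         # percentage of disk space saved

print(f"{ratio:.3f} of original size, {saved_pct:.1f}% saved")
# → 0.680 of original size, 32.0% saved
```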

What is this?

Mistral-7B-Instruct compressed with fraQtl. Same architecture, smaller files, near-zero quality loss.

Try it live

fraQtl Demo: this model running with KV cache compression on top.

Access

Weights are gated. For access or integration support: contact@fraqtl.ai
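Since the card says the compressed model keeps the same architecture, a plausible loading path is the stock transformers API with an access token once the gate is approved. This is a hedged sketch: the repo id `fraQtl/Mistral-7B-compressed` and a standard transformers-compatible layout are assumptions, not confirmed by the card.

```python
# Sketch: loading the gated weights with transformers, assuming the repo id
# below and that the compressed files load through the ordinary Mistral path.
REPO_ID = "fraQtl/Mistral-7B-compressed"  # assumed repo id

def load_model(token: str):
    """Load tokenizer and model with a Hugging Face access token."""
    # Imports are local so the sketch reads without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID, token=token)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, token=token)
    return tokenizer, model
```

The `token` argument is the personal access token from your Hugging Face account; the download only succeeds after the gate request has been granted.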

Reference

Paper: arXiv:2604.11501 | Website: fraqtl.ai
