QuantTrio/MiniMax-M2-REAP-162B-A10B-AWQ
Tags: Text Generation · Transformers · Safetensors · minimax_m2 · vLLM · AWQ · conversational · custom_code · 4-bit precision · awq
arXiv: 2510.13999
License: MIT
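Given the vLLM, AWQ, and custom_code tags above, a minimal serving sketch for this checkpoint might look like the following. The repository name comes from this card; the flags are illustrative assumptions, not instructions from the card itself.

```shell
# Hypothetical sketch: serve this AWQ checkpoint with vLLM's OpenAI-compatible server.
# --trust-remote-code is suggested by the custom_code tag on the card;
# the tensor-parallel setting is an assumption and depends on your GPUs.
vllm serve QuantTrio/MiniMax-M2-REAP-162B-A10B-AWQ \
  --trust-remote-code \
  --tensor-parallel-size 4
```

vLLM generally detects AWQ quantization from the checkpoint config, so no explicit quantization flag is shown here.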
Community discussions
Can you provide the AWQ quantization model for minimax-m2-reap-172b-a10b?
#1 · opened about 17 hours ago by fanhed