Zen4 Storm

Zen4 Storm is a 456B-parameter MoE (45B active) language model from the Zen4 family by Zen LM and Hanzo AI. It is a hybrid MoE with Lightning Attention for ultra-long-context reasoning.

Model Details

| Property | Value |
|---|---|
| Parameters | 456B total (MoE), 45B active |
| Architecture | Zen4 Frontier |
| Context | 1M tokens |
| License | MIT |
| Family | Zen4 |
| Tier | Frontier |
| Creator | Zen LM / Hanzo AI |

Weights

The weights are hosted at MiniMaxAI/MiniMax-M1-80k due to storage constraints; use that repository for inference and fine-tuning.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code is needed because the MiniMax checkpoints ship a custom
# model implementation; device_map="auto" shards the weights across available devices.
model = AutoModelForCausalLM.from_pretrained(
    "MiniMaxAI/MiniMax-M1-80k",
    torch_dtype="auto",
    device_map="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("MiniMaxAI/MiniMax-M1-80k")
```
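Before loading a checkpoint of this size, it helps to estimate its weight footprint. A rough sketch, using only the parameter counts from the table above (456B total, 45B active) and standard dtype sizes; these are back-of-the-envelope numbers, not official hardware requirements:

```python
def param_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB for a given parameter count."""
    return num_params * bytes_per_param / 2**30

TOTAL_PARAMS = 456e9   # all experts must be resident for the router to choose among them
ACTIVE_PARAMS = 45e9   # parameters actually exercised per forward token

for dtype, bpp in [("bf16", 2), ("int8", 1), ("int4", 0.5)]:
    total = param_memory_gib(TOTAL_PARAMS, bpp)
    active = param_memory_gib(ACTIVE_PARAMS, bpp)
    print(f"{dtype}: ~{total:,.0f} GiB weights resident, ~{active:,.0f} GiB touched per token")
```

Note the MoE asymmetry: while only ~45B parameters run per token, the full 456B must still be held in memory (or offloaded), which is why inference on this model typically requires a multi-GPU node even at low precision.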

Links


Zen AI: Clarity Through Intelligence
