Run this on your Mac with Outlier — a free macOS app for local MLX inference.

Phi-4-mini-instruct (MLX 4-bit)

An MLX 4-bit conversion of microsoft/Phi-4-mini-instruct. The license and base-model fields are inherited from the original; see the YAML frontmatter above.

Load with mlx-lm

```shell
pip install mlx-lm
python -m mlx_lm.generate --model Outlier-Ai/Phi-4-mini-instruct-MLX-4bit --prompt "Hello" --max-tokens 256
```
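The same can be done from Python. A minimal sketch using mlx-lm's `load`/`generate` helpers (requires Apple silicon; the chat-template step assumes the tokenizer bundles Phi-4-mini's instruct template, which mlx-lm applies via the underlying Hugging Face tokenizer):

```python
# Sketch: load this model with the mlx-lm Python API instead of the CLI.
# Downloads the weights from the Hub on first run; needs Apple silicon + MLX.
from mlx_lm import load, generate

model, tokenizer = load("Outlier-Ai/Phi-4-mini-instruct-MLX-4bit")

# Phi-4-mini is an instruct model, so format the prompt with the chat template.
messages = [{"role": "user", "content": "Hello"}]
prompt = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=False
)

text = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(text)
```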

What is Outlier?

A free macOS app that runs MLX models locally — no cloud, no API keys, no usage caps.

outlier.host

Other Outlier conversions

License

Inherits from the upstream model (MIT). See the base model card.

