lthn/LEM-research
How to use LetheanNetwork/lemma-bk with MLX:
```shell
# Download the model from the Hub
pip install huggingface_hub[hf_xet]
huggingface-cli download --local-dir lemma-bk LetheanNetwork/lemma-bk
```
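Once downloaded, the weights can be run locally with the `mlx-lm` package. A minimal sketch, assuming `mlx-lm` is installed and that this finetune keeps Gemma's turn-based chat template (an assumption not stated on the card):

```python
# Sketch: generate text from the locally downloaded MLX weights.
# mlx-lm is imported lazily below so the prompt helper stays usable
# without Apple-silicon hardware.

def format_chat(user_message: str) -> str:
    # Gemma-style turn markers; assumes lemma-bk keeps the base model's
    # chat template (hypothetical, not confirmed by the card).
    return (
        f"<start_of_turn>user\n{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

if __name__ == "__main__":
    from mlx_lm import load, generate  # optional dependency: pip install mlx-lm
    model, tokenizer = load("lemma-bk")  # local dir from the download step above
    print(generate(model, tokenizer, prompt=format_chat("Hello"), max_tokens=64))
```

In practice `mlx-lm` can also apply the tokenizer's own chat template; the explicit helper above just makes the prompt format visible.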
How to use LetheanNetwork/lemma-bk with Transformers:
```python
# Load model directly
from transformers import AutoProcessor, AutoModelForImageTextToText

processor = AutoProcessor.from_pretrained("LetheanNetwork/lemma-bk")
model = AutoModelForImageTextToText.from_pretrained("LetheanNetwork/lemma-bk")
```

A Gemma 4 E4B finetune by lthn.ai — EUPL-1.2
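To actually generate with the loaded model, the processor's chat-template API can be used. A minimal sketch, assuming the model follows the standard Transformers image-text-to-text chat interface (the heavy download is guarded so the message helper runs on its own):

```python
# Sketch: one-turn text generation with the model loaded above.
from typing import Dict, List


def build_messages(prompt: str) -> List[Dict]:
    # Chat-format payload accepted by AutoProcessor.apply_chat_template.
    return [{"role": "user", "content": [{"type": "text", "text": prompt}]}]


if __name__ == "__main__":
    import torch
    from transformers import AutoModelForImageTextToText, AutoProcessor

    processor = AutoProcessor.from_pretrained("LetheanNetwork/lemma-bk")
    model = AutoModelForImageTextToText.from_pretrained(
        "LetheanNetwork/lemma-bk", torch_dtype=torch.bfloat16
    )
    inputs = processor.apply_chat_template(
        build_messages("Describe this model."),
        add_generation_prompt=True,
        tokenize=True,
        return_dict=True,
        return_tensors="pt",
    )
    out = model.generate(**inputs, max_new_tokens=64)
    print(processor.decode(out[0], skip_special_tokens=True))
```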
MLX quantizations: 4bit, 5bit, 6bit, 8bit, bf16, mxfp4, mxfp8, nvfp4
Lemer (E2B) · Lemma (E4B) · Lemmy (26B) · Lemrd (31B)
Training data and adapter: EUPL-1.2 · Base model: Apache 2.0