tokie

gemma-2-2b

Pre-built tokie tokenizer for google/gemma-2-2b.

Quick Start (Python)

pip install tokie
import tokie

tokenizer = tokie.Tokenizer.from_pretrained("tokiers/gemma-2-2b")
encoding = tokenizer.encode("Hello, world!")
print(encoding.ids)
print(encoding.attention_mask)

Quick Start (Rust)

[dependencies]
tokie = { version = "0.0.7", features = ["hf"] }
use tokie::Tokenizer;

let tokenizer = Tokenizer::from_pretrained("tokiers/gemma-2-2b").unwrap();
let encoding = tokenizer.encode("Hello, world!", true);
println!("{:?}", encoding.ids);

Files

  • tokenizer.tkz - tokie binary format (~10x smaller than tokenizer.json, loads in ~5ms)
  • tokenizer.json - original HuggingFace tokenizer

About tokie

Up to 50x faster tokenization and ~10x smaller model files, with output identical to the original tokenizer.

tokie is a drop-in replacement for HuggingFace tokenizers, written in Rust. See the GitHub repository for benchmarks and documentation.
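Because the interface mirrors HuggingFace tokenizers, switching should mostly be a matter of swapping the import. A minimal sketch of what that swap might look like (the `from tokie import Tokenizer` form and the unchanged call sites are assumptions, not confirmed against tokie's documented API):

from tokenizers import Tokenizer   # before: HuggingFace tokenizers
# from tokie import Tokenizer      # after: tokie (assumed importable the same way)

# Call sites stay the same either way.
tokenizer = Tokenizer.from_pretrained("tokiers/gemma-2-2b")
encoding = tokenizer.encode("Hello, world!")
print(encoding.ids)

If tokie does not expose the class at the top level, the fully qualified form from the quick start (`tokie.Tokenizer.from_pretrained(...)`) is the documented path.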

License

MIT OR Apache-2.0 (tokie library). Original model files retain their original license from google/gemma-2-2b.
