Use from the Transformers library
# Load model directly
from transformers import AutoTokenizer, CLIPTextModelWithProjection

tokenizer = AutoTokenizer.from_pretrained("peft-internal-testing/tiny-clip-text-2")
model = CLIPTextModelWithProjection.from_pretrained("peft-internal-testing/tiny-clip-text-2")
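Once loaded, the model produces projected text embeddings. A minimal sketch of a forward pass (the prompt string is only illustrative):

```python
import torch
from transformers import AutoTokenizer, CLIPTextModelWithProjection

tokenizer = AutoTokenizer.from_pretrained("peft-internal-testing/tiny-clip-text-2")
model = CLIPTextModelWithProjection.from_pretrained("peft-internal-testing/tiny-clip-text-2")

# Tokenize a batch of text prompts
inputs = tokenizer(["a photo of a cat"], padding=True, return_tensors="pt")

# Run inference without tracking gradients
with torch.no_grad():
    outputs = model(**inputs)

# Projected text embeddings, shape (batch_size, projection_dim)
text_embeds = outputs.text_embeds
print(text_embeds.shape)
```

Note that this is a tiny internal-testing checkpoint (about 69.5k parameters), intended for CI and example pipelines rather than producing meaningful embeddings.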
Downloads last month: 573,349
Model size: 69.5k params (Safetensors)
Tensor type: F32

Spaces using peft-internal-testing/tiny-clip-text-2: 1