Tags: Feature Extraction, Transformers, Safetensors, qwen, llama-factory, freeze, Generated from Trainer, custom_code
Instructions to use mohit95559/mymodel with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use mohit95559/mymodel with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="mohit95559/mymodel", trust_remote_code=True)

# Or load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("mohit95559/mymodel", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
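
The feature-extraction pipeline returns per-token hidden states as nested lists shaped `[batch, seq_len, hidden_size]`; a common next step is mean pooling over tokens to get one embedding vector per input. A minimal sketch of that step, using a small hypothetical output in place of a real `pipe("some text")` call (the values and the hidden size of 4 are illustrative, not from this model):

```python
import numpy as np

# Hypothetical pipeline output for one input: batch=1, seq_len=3, hidden_size=4.
# In practice this would come from: features = pipe("some text")
features = [[[0.1, 0.2, 0.3, 0.4],
             [0.5, 0.6, 0.7, 0.8],
             [0.9, 1.0, 1.1, 1.2]]]

tokens = np.array(features[0])            # shape (seq_len, hidden_size)
sentence_embedding = tokens.mean(axis=0)  # shape (hidden_size,)
print(sentence_embedding)                 # approximately [0.5, 0.6, 0.7, 0.8]
```

Mean pooling is only one choice; taking the first token's vector or max pooling are common alternatives depending on how the model was trained.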