Instructions for using Sayan01/stsb-distillbert-Direct with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Sayan01/stsb-distillbert-Direct with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Sayan01/stsb-distillbert-Direct")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Sayan01/stsb-distillbert-Direct")
model = AutoModelForSequenceClassification.from_pretrained("Sayan01/stsb-distillbert-Direct")
```
- Notebooks
- Google Colab
- Kaggle
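STS-B is a sentence-similarity regression task, so the sequence-classification head typically emits a single unnormalized score per sentence pair rather than class probabilities. A minimal post-processing sketch, assuming the model follows the standard STS-B convention of a 0–5 similarity scale (the helper name and bounds here are illustrative, not part of the model's API):

```python
# Hypothetical post-processing for an STS-B style regression output.
# Assumption: the model emits one raw similarity score per sentence pair,
# intended to lie on the dataset's 0-5 similarity scale.
def to_similarity(raw_score: float, low: float = 0.0, high: float = 5.0) -> float:
    """Clamp a raw regression output to the STS-B 0-5 similarity scale."""
    return max(low, min(high, raw_score))

print(to_similarity(5.7))   # → 5.0 (clamped to the upper bound)
print(to_similarity(2.3))   # → 2.3 (already in range)
print(to_similarity(-1.0))  # → 0.0 (clamped to the lower bound)
```

In practice you would feed the raw value extracted from the pipeline or model output (e.g. the single logit from `model(**tokenizer(s1, s2, return_tensors="pt")).logits`) through a clamp like this before reporting a similarity.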
- Xet hash: 78c1bc1e25775fe3c291b6801ab908b4290eba6c9e783f8ad19660cf628c5e19
- Size of remote file: 268 MB
- SHA256: 1073a0ad4d3aabbce97c2f6f0e56634feb7fcf61c817d5687c00a83c7949e9e9
Xet efficiently stores large files inside Git by splitting them into unique chunks, which deduplicates storage and accelerates uploads and downloads.
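The chunk-splitting idea can be illustrated with content-defined chunking, the general technique such systems use so that identical regions of a file map to identical chunks. This is a toy sketch only; the rolling hash and boundary mask below are arbitrary choices for illustration, not Xet's actual algorithm or parameters:

```python
# Illustrative content-defined chunking: split at positions where a simple
# rolling hash hits a boundary pattern. Hash and mask are made-up values,
# not what Xet actually uses.
def chunk(data: bytes, mask: int = 0x3F) -> list[bytes]:
    """Split data at hash-defined boundaries; chunks concatenate back to data."""
    chunks, start, h = [], 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) ^ b) & 0xFFFFFFFF   # toy rolling hash over the bytes
        if (h & mask) == 0 and i > start:  # low bits zero -> chunk boundary
            chunks.append(data[start:i + 1])
            start = i + 1
            h = 0
    if start < len(data):                  # flush the trailing chunk
        chunks.append(data[start:])
    return chunks

blob = bytes(range(256)) * 8
parts = chunk(blob)
assert b"".join(parts) == blob  # lossless: chunks reassemble to the original
```

Because boundaries depend on content rather than fixed offsets, inserting bytes near the start of a file only changes the chunks around the edit, so unchanged chunks can be stored and transferred once.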