Text Classification
Transformers
TensorBoard
Safetensors
bert
Generated from Trainer
text-embeddings-inference
Instructions to use abbassix/2d6_1600 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use abbassix/2d6_1600 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="abbassix/2d6_1600")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("abbassix/2d6_1600")
model = AutoModelForSequenceClassification.from_pretrained("abbassix/2d6_1600")
```

- Notebooks
- Google Colab
- Kaggle
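The text-classification pipeline returns a list of dicts of the form `{"label": ..., "score": ...}`. As a minimal sketch of how to read that output, the helper below picks the highest-scoring prediction; the helper name and the mocked output are illustrative assumptions, not part of this model card, and in real use the input would come from `pipe("your text")` after building the pipeline as shown above.

```python
# Hypothetical helper (not from the card): extract the top prediction
# from pipeline-style output, i.e. a list of {"label": ..., "score": ...} dicts.
def top_prediction(results):
    """Return (label, score) for the highest-scoring entry."""
    best = max(results, key=lambda r: r["score"])
    return best["label"], best["score"]

# Mocked pipeline output for illustration; a real call would be
# pipe("your text") with pipe = pipeline("text-classification", model="abbassix/2d6_1600").
mock = [{"label": "LABEL_0", "score": 0.12}, {"label": "LABEL_1", "score": 0.88}]
print(top_prediction(mock))  # → ('LABEL_1', 0.88)
```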